Chat Prompt Templates
The Cassandra-specific approach can be seamlessly integrated with LangChain's "chat prompt templates".
In [1]:
from langchain.prompts import createCassandraPromptTemplate
In [2]:
from cqlsession import getCQLSession, getCQLKeyspace
cqlMode = 'astra_db' # 'astra_db'/'local'
session = getCQLSession(mode=cqlMode)
keyspace = getCQLKeyspace(mode=cqlMode)
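The getCQLSession / getCQLKeyspace helpers come from the cqlsession.py module that accompanies these examples and is not reproduced here. As a rough sketch of what the Astra DB branch of such helpers might look like (the environment-variable names below are assumptions, not part of the original code):

import os
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

def getCQLSession(mode='astra_db'):
    # Sketch of the Astra DB case only; 'local' mode would simply be Cluster().connect()
    assert mode == 'astra_db'
    cluster = Cluster(
        cloud={'secure_connect_bundle': os.environ['ASTRA_DB_SECURE_BUNDLE_PATH']},
        auth_provider=PlainTextAuthProvider(
            'token',
            os.environ['ASTRA_DB_APPLICATION_TOKEN'],
        ),
    )
    return cluster.connect()

def getCQLKeyspace(mode='astra_db'):
    # The target keyspace, read from an (assumed) environment variable
    return os.environ['ASTRA_DB_KEYSPACE']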
This is the prompt for a single message in the chat sequence.
We create it just as we would a stand-alone Cassandra prompt template.
In [3]:
systemTemplate = """
You are a chat assistant, helping a user of age {user_age} from a city
they refer to as {city_nickname}.
"""
In [4]:
cassSystemPrompt = createCassandraPromptTemplate(
    session=session,
    keyspace=keyspace,
    template=systemTemplate,
    input_variables=['city', 'name'],
    field_mapper={
        'user_age': ('people', 'age'),
        'city_nickname': ('nickname_by_city', 'nickname'),
    },
)
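As a quick check (assuming the people and nickname_by_city tables are populated as in the stand-alone prompt template example, and that the Cassandra prompt template exposes the usual format method of LangChain prompt templates), this single prompt could be rendered on its own:

print(cassSystemPrompt.format(city='turin', name='beppe'))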
In [5]:
from langchain.prompts import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
systemMessagePrompt = SystemMessagePromptTemplate(prompt=cassSystemPrompt)
A sequence of messages
Having wrapped the single prompt template as a "system message prompt", let's make it part of a longer chat conversation:
In [6]:
humanTemplate = "{text}"
humanMessagePrompt = HumanMessagePromptTemplate.from_template(humanTemplate)
In [7]:
cassChatPrompt = ChatPromptTemplate.from_messages(
    [systemMessagePrompt, humanMessagePrompt]
)
Rendering
LangChain takes care of correctly propagating the rendering steps throughout the sequence of messages, including the Cassandra-backed template:
In [8]:
print(cassChatPrompt.format_prompt(
    city='turin',
    name='beppe',
    text='Assistant, please help me!'
).to_string())
System: You are a chat assistant, helping a user of age 2 from a city they refer to as CereaNeh.
Human: Assistant, please help me!
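From here, the same chat prompt can feed an actual chat model: format_prompt(...) also exposes to_messages(), which returns message objects ready for an LLM call. A minimal sketch, assuming the langchain OpenAI chat integration is installed and an OPENAI_API_KEY is set in the environment:

from langchain.chat_models import ChatOpenAI

# Render the Cassandra-backed chat prompt to message objects
# and send them to a chat model (requires OPENAI_API_KEY).
chatModel = ChatOpenAI(temperature=0)
messages = cassChatPrompt.format_prompt(
    city='turin',
    name='beppe',
    text='Assistant, please help me!',
).to_messages()
response = chatModel(messages)
print(response.content)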