Astra DB Chat Memory
For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for Astra DB.
Setup
You need to install the Astra DB TS client:
- npm: npm install @datastax/astra-db-ts
- Yarn: yarn add @datastax/astra-db-ts
- pnpm: pnpm add @datastax/astra-db-ts
You'll also need to install the LangChain peer dependencies:

- npm: npm install @langchain/openai @langchain/community @langchain/core
- Yarn: yarn add @langchain/openai @langchain/community @langchain/core
- pnpm: pnpm add @langchain/openai @langchain/community @langchain/core
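The code below reads Astra DB credentials from environment variables. A minimal sketch of the variables it expects — the values shown are placeholders; use the token, API endpoint, and namespace from your own Astra DB dashboard:

```shell
# Placeholder values — replace with credentials from your Astra DB dashboard.
export ASTRA_DB_APPLICATION_TOKEN="AstraCS:your-token-here"
export ASTRA_DB_ENDPOINT="https://your-db-id-your-region.apps.astra.datastax.com"
export ASTRA_DB_NAMESPACE="default_keyspace"
```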
Configuration and Initialization
There are two ways to initialize your AstraDBChatMessageHistory.
If you already have an instance of the AstraDB client defined, you can connect to your collection and initialize an instance of the ChatMessageHistory using the constructor.
import { AstraDB } from "@datastax/astra-db-ts";
import { AstraDBChatMessageHistory } from "@langchain/community/stores/message/astradb";

const client = new AstraDB(
  process.env.ASTRA_DB_APPLICATION_TOKEN,
  process.env.ASTRA_DB_ENDPOINT,
  process.env.ASTRA_DB_NAMESPACE
);

const collection = await client.collection("YOUR_COLLECTION_NAME");

const chatHistory = new AstraDBChatMessageHistory({
  collection,
  sessionId: "YOUR_SESSION_ID",
});
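Once constructed, the history can also be used directly, outside of a chain. A sketch using the base message-history methods from @langchain/core — this assumes the `chatHistory` instance, collection, and credentials set up above:

```typescript
import { HumanMessage, AIMessage } from "@langchain/core/messages";

// Append messages for this session; they are persisted to the collection.
await chatHistory.addMessage(new HumanMessage("Hi! I'm Jim."));
await chatHistory.addMessage(new AIMessage("Hello Jim! How can I help?"));

// Read the full stored transcript back for this session.
const messages = await chatHistory.getMessages();

// Remove all stored messages for this session.
await chatHistory.clear();
```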
If you don't already have an instance of the AstraDB client, you can use the initialize method.
const chatHistory = await AstraDBChatMessageHistory.initialize({
  token: process.env.ASTRA_DB_APPLICATION_TOKEN ?? "token",
  endpoint: process.env.ASTRA_DB_ENDPOINT ?? "endpoint",
  namespace: process.env.ASTRA_DB_NAMESPACE,
  collectionName: "YOUR_COLLECTION_NAME",
  sessionId: "YOUR_SESSION_ID",
});
Usage

Your collection must already exist.
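The collection is not created for you. One option is to create it from the Astra DB dashboard; alternatively, it can be created ahead of time with the client — note that `createCollection` here is an assumption about the @datastax/astra-db-ts client's API, so check your installed version:

```typescript
import { AstraDB } from "@datastax/astra-db-ts";

const client = new AstraDB(
  process.env.ASTRA_DB_APPLICATION_TOKEN,
  process.env.ASTRA_DB_ENDPOINT,
  process.env.ASTRA_DB_NAMESPACE
);

// Create the collection before using it for chat history.
// createCollection is assumed from the astra-db-ts client API.
await client.createCollection("YOUR_COLLECTION_NAME");
```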
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { ChatOpenAI } from "@langchain/openai";
import { AstraDBChatMessageHistory } from "@langchain/community/stores/message/astradb";

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

const prompt = ChatPromptTemplate.fromMessages([
  [
    "system",
    "You are a helpful assistant. Answer all questions to the best of your ability.",
  ],
  new MessagesPlaceholder("chat_history"),
  ["human", "{input}"],
]);

const chain = prompt.pipe(model).pipe(new StringOutputParser());

const chainWithHistory = new RunnableWithMessageHistory({
  runnable: chain,
  inputMessagesKey: "input",
  historyMessagesKey: "chat_history",
  getMessageHistory: async (sessionId) => {
    const chatHistory = await AstraDBChatMessageHistory.initialize({
      token: process.env.ASTRA_DB_APPLICATION_TOKEN as string,
      endpoint: process.env.ASTRA_DB_ENDPOINT as string,
      namespace: process.env.ASTRA_DB_NAMESPACE,
      collectionName: "YOUR_COLLECTION_NAME",
      sessionId,
    });
    return chatHistory;
  },
});

const res1 = await chainWithHistory.invoke(
  {
    input: "Hi! I'm Jim.",
  },
  { configurable: { sessionId: "langchain-test-session" } }
);

console.log({ res1 });

/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chainWithHistory.invoke(
  { input: "What did I just say my name was?" },
  { configurable: { sessionId: "langchain-test-session" } }
);

console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
API Reference:
- RunnableWithMessageHistory from @langchain/core/runnables
- ChatPromptTemplate from @langchain/core/prompts
- MessagesPlaceholder from @langchain/core/prompts
- StringOutputParser from @langchain/core/output_parsers
- ChatOpenAI from @langchain/openai
- AstraDBChatMessageHistory from @langchain/community/stores/message/astradb