Convex Chat Memory

For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for Convex.

Setup

Create project

Get a working Convex project set up, for example by using:

npm create convex@latest

Add database accessors

Add query and mutation helpers to convex/langchain/db.ts:

convex/langchain/db.ts
export * from "@langchain/community/utils/convex";

Configure your schema

Set up your schema (for indexing):

convex/schema.ts
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  messages: defineTable({
    sessionId: v.string(),
    message: v.object({
      type: v.string(),
      data: v.object({
        content: v.string(),
        role: v.optional(v.string()),
        name: v.optional(v.string()),
        additional_kwargs: v.optional(v.any()),
      }),
    }),
  }).index("bySessionId", ["sessionId"]),
});
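For illustration, each document in the messages table holds one serialized chat message. The following is a hypothetical example of what a stored row might look like under this schema (the values are made up, not output from a real run):

// Hypothetical document in the "messages" table (illustrative values only).
const exampleRow = {
  sessionId: "user-123", // ties the message to one chat session (indexed above)
  message: {
    type: "human", // e.g. "human" for user input, "ai" for model output
    data: {
      content: "Hi! I'm Jim.",
      additional_kwargs: {},
    },
  },
};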

Usage

Each chat history session stored in Convex must have a unique session ID (the sketch after the example below shows one way to generate it from a client).

npm install @langchain/openai @langchain/community @langchain/core
convex/myActions.ts
"use node";

import { v } from "convex/values";
import { BufferMemory } from "langchain/memory";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { ConvexChatMessageHistory } from "@langchain/community/stores/message/convex";
import { action } from "./_generated/server.js";

export const ask = action({
  args: { sessionId: v.string() },
  handler: async (ctx, args) => {
    // Pass in a sessionId string
    const { sessionId } = args;

    const memory = new BufferMemory({
      chatHistory: new ConvexChatMessageHistory({ sessionId, ctx }),
    });

    const model = new ChatOpenAI({
      model: "gpt-3.5-turbo",
      temperature: 0,
    });

    const chain = new ConversationChain({ llm: model, memory });

    const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
    console.log({ res1 });
    /*
      {
        res1: {
          text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
        }
      }
    */

    const res2 = await chain.invoke({
      input: "What did I just say my name was?",
    });
    console.log({ res2 });
    /*
      {
        res2: {
          text: "You said your name was Jim."
        }
      }
    */

    // See the chat history in the Convex database
    console.log(await memory.chatHistory.getMessages());

    // Clear the chat history
    await memory.chatHistory.clear();
  },
});
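To run the action, call it from a client with a fresh session ID for each conversation. The snippet below is a minimal sketch, not part of the example above: it assumes a deployed Convex backend whose URL is available as CONVEX_URL (the exact environment variable depends on your setup), and uses crypto.randomUUID() as just one way to generate a unique ID.

src/startChat.ts (hypothetical client-side helper)
import { ConvexHttpClient } from "convex/browser";
import { api } from "../convex/_generated/api";

// Assumes the deployment URL is exposed to the client as CONVEX_URL.
const client = new ConvexHttpClient(process.env.CONVEX_URL!);

export async function startChat() {
  // Each conversation gets its own unique session ID so its history
  // is stored separately in the "messages" table.
  const sessionId = crypto.randomUUID();
  await client.action(api.myActions.ask, { sessionId });
  return sessionId;
}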

API Reference: BufferMemory from langchain/memory, ChatOpenAI from @langchain/openai, ConversationChain from langchain/chains, ConvexChatMessageHistory from @langchain/community/stores/message/convex.