Xata Chat Memory

Xata is a serverless data platform based on PostgreSQL. It provides a type-safe TypeScript/JavaScript SDK for interacting with your database and a UI for managing your data.

With the XataChatMessageHistory class, you can use Xata databases for longer-term persistence of chat sessions.

Because Xata works via a REST API and has a pure TypeScript SDK, you can use this with Vercel Edge, Cloudflare Workers, and any other serverless environment.

Setup

Install the Xata CLI

npm install @xata.io/cli -g

Create a database

In the Xata UI, create a new database. You can name it whatever you want, but for this example we'll use langchain.

When executed for the first time, the Xata LangChain integration will create the table used for storing the chat messages. If a table with that name already exists, it will be left untouched.

Initialize the project

In your project, run:

xata init

and then choose the database you created above. This will also generate a xata.ts or xata.js file that defines the client you can use to interact with the database. See the Xata getting started docs for more details on using the Xata JavaScript/TypeScript SDK.
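
With the generated client, the import might look like the following (a minimal sketch; the "./xata" path is an assumption about where xata init placed the generated file in your project):

import { getXataClient } from "./xata"; // generated by `xata init`; adjust the path to your project layout

const xata = getXataClient();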

Usage

Each chat history session stored in the Xata database must have a unique id.
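
For example, a random UUID works well as a session identifier, since you need to reuse the same value later to continue the same conversation (a minimal sketch; crypto.randomUUID() is available as a global in recent Node.js versions and in browsers):

const sessionId = crypto.randomUUID(); // generate once per conversation and keep it, e.g. alongside the user record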

In this example, the getXataClient() function is used to create a new Xata client based on environment variables. However, we recommend using the code generated by the xata init command, in which case you only need to import the getXataClient() function from the generated xata.ts file.

npm install @langchain/openai @langchain/community @langchain/core

import { BufferMemory } from "langchain/memory";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { XataChatMessageHistory } from "@langchain/community/stores/message/xata";
import { BaseClient } from "@xata.io/client";

// if you use the generated client, you don't need this function.
// Just import getXataClient from the generated xata.ts instead.
const getXataClient = () => {
  if (!process.env.XATA_API_KEY) {
    throw new Error("XATA_API_KEY not set");
  }

  if (!process.env.XATA_DB_URL) {
    throw new Error("XATA_DB_URL not set");
  }
  const xata = new BaseClient({
    databaseURL: process.env.XATA_DB_URL,
    apiKey: process.env.XATA_API_KEY,
    branch: process.env.XATA_BRANCH || "main",
  });
  return xata;
};

const memory = new BufferMemory({
  chatHistory: new XataChatMessageHistory({
    table: "messages",
    sessionId: new Date().toISOString(), // Or some other unique identifier for the conversation
    client: getXataClient(),
    apiKey: process.env.XATA_API_KEY, // The API key is needed for creating the table.
  }),
});

const model = new ChatOpenAI();
const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
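
Because the messages are persisted in Xata, you can reconstruct the same history later by passing the same session id again. A minimal sketch (the "my-conversation-id" value is a placeholder for whatever id you stored for the conversation; getMessages() and clear() come from the base chat message history interface):

const sameHistory = new XataChatMessageHistory({
  table: "messages",
  sessionId: "my-conversation-id", // the same id used when the conversation was first stored
  client: getXataClient(),
  apiKey: process.env.XATA_API_KEY,
});

const storedMessages = await sameHistory.getMessages(); // loads the persisted messages for this session
console.log(storedMessages.length);

// await sameHistory.clear(); // optionally wipe this session's messages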

With pre-created table

If you don't want the code to always check if the table exists, you can create the table manually in the Xata UI and pass createTable: false to the constructor. The table must have the following columns (a schema sketch follows the list):

  • sessionId of type String

  • type of type String

  • role of type String

  • content of type Text

  • name of type String

  • additionalKwargs of type Text
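
As a reference, the same columns expressed as a schema object might look like the sketch below (a TypeScript mirror of Xata's schema JSON; the lowercase type names are assumptions based on Xata's schema conventions):

// Sketch of the pre-created "messages" table, mirroring Xata's schema JSON.
// Column type names ("string", "text") are assumed from Xata's conventions.
const messagesTableSchema = {
  tables: [
    {
      name: "messages",
      columns: [
        { name: "sessionId", type: "string" },
        { name: "type", type: "string" },
        { name: "role", type: "string" },
        { name: "content", type: "text" },
        { name: "name", type: "string" },
        { name: "additionalKwargs", type: "text" },
      ],
    },
  ],
};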

import { BufferMemory } from "langchain/memory";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { XataChatMessageHistory } from "@langchain/community/stores/message/xata";
import { BaseClient } from "@xata.io/client";

// Before running this example, see the docs at
// https://js.langchain.com/docs/modules/memory/integrations/xata

// if you use the generated client, you don't need this function.
// Just import getXataClient from the generated xata.ts instead.
const getXataClient = () => {
  if (!process.env.XATA_API_KEY) {
    throw new Error("XATA_API_KEY not set");
  }

  if (!process.env.XATA_DB_URL) {
    throw new Error("XATA_DB_URL not set");
  }
  const xata = new BaseClient({
    databaseURL: process.env.XATA_DB_URL,
    apiKey: process.env.XATA_API_KEY,
    branch: process.env.XATA_BRANCH || "main",
  });
  return xata;
};

const memory = new BufferMemory({
  chatHistory: new XataChatMessageHistory({
    table: "messages",
    sessionId: new Date().toISOString(), // Or some other unique identifier for the conversation
    client: getXataClient(),
    createTable: false, // Explicitly set to false if the table is already created
  }),
});

const model = new ChatOpenAI();
const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
