
Redis-Backed Chat Memory

For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for a Redis instance.

Setup

You will need to install node-redis in your project:

npm install @langchain/openai @langchain/community @langchain/core redis

You will also need a Redis instance to connect to. See instructions on the official Redis website for running the server locally.
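
For local development, one common option is running the official Redis image with Docker (shown as one possible setup, not a requirement):

docker run -d -p 6379:6379 redis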

Usage

Each chat history session stored in Redis must have a unique id. You can provide an optional sessionTTL to make sessions expire after a given number of seconds. The config parameter is passed directly into the createClient method of node-redis, and takes all the same arguments; a sketch showing config appears after the example below.

import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "@langchain/redis";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(), // Or some other unique identifier for the conversation
    sessionTTL: 300, // 5 minutes, omit this parameter to make sessions never expire
  }),
});

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/


Advanced Usage

You can also directly pass in a previously created Redis client instance. Note that this example uses an ioredis client with the corresponding @langchain/community integration, rather than node-redis:

import { Redis } from "ioredis";
import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "@langchain/community/stores/message/ioredis";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const client = new Redis("redis://localhost:6379");

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(),
    sessionTTL: 300,
    client,
  }),
});

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/


Redis Sentinel Support

You can also back your chat memory with a Redis Sentinel setup using ioredis.

This will require the installation of ioredis in your project.

npm install ioredis

import { Redis } from "ioredis";
import { BufferMemory } from "langchain/memory";
import { RedisChatMessageHistory } from "@langchain/community/stores/message/ioredis";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

// Uses ioredis to facilitate Sentinel connections; see their docs for details on setting up more complex Sentinel configurations: https://github.com/redis/ioredis#sentinel
const client = new Redis({
  sentinels: [
    { host: "localhost", port: 26379 },
    { host: "localhost", port: 26380 },
  ],
  name: "mymaster",
});

const memory = new BufferMemory({
  chatHistory: new RedisChatMessageHistory({
    sessionId: new Date().toISOString(),
    sessionTTL: 300,
    client,
  }),
});

const model = new ChatOpenAI({ temperature: 0.5 });

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/
