ChatDeepInfra
LangChain supports chat models hosted by Deep Infra through the ChatDeepInfra
wrapper.
First, you'll need to install the @langchain/community
package:
npm install @langchain/community @langchain/core
yarn add @langchain/community @langchain/core
pnpm add @langchain/community @langchain/core
You'll need to obtain an API key and set it as an environment variable named DEEPINFRA_API_TOKEN
(or pass it into the constructor), then call the model as shown below:
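For example, on macOS/Linux you can export the token in your shell before running your script (the value shown is a placeholder):

```shell
# Placeholder value — replace with your actual Deep Infra API token
export DEEPINFRA_API_TOKEN="your-api-token-here"
```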
import { ChatDeepInfra } from "@langchain/community/chat_models/deepinfra";
import { HumanMessage } from "@langchain/core/messages";
const apiKey = process.env.DEEPINFRA_API_TOKEN;
const model = "meta-llama/Meta-Llama-3-70B-Instruct";
const chat = new ChatDeepInfra({
  model,
  apiKey,
});
const messages = [new HumanMessage("Hello")];
const res = await chat.invoke(messages);
console.log(res);
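Because ChatDeepInfra implements the standard LangChain Runnable interface, you can also stream tokens as they are generated instead of waiting for the full response. A minimal sketch, assuming DEEPINFRA_API_TOKEN is set in the environment:

```typescript
import { ChatDeepInfra } from "@langchain/community/chat_models/deepinfra";

const chat = new ChatDeepInfra({
  model: "meta-llama/Meta-Llama-3-70B-Instruct",
});

// .stream() returns an async iterable of message chunks,
// which arrive as the model generates them.
const stream = await chat.stream("Write a haiku about infrastructure.");
for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
```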
API Reference:
- ChatDeepInfra from
@langchain/community/chat_models/deepinfra
- HumanMessage from
@langchain/core/messages
Related
- Chat model conceptual guide
- Chat model how-to guides