
DeepInfra

LangChain supports LLMs hosted by Deep Infra through the DeepInfra wrapper. First, you'll need to install the @langchain/community package:

npm install @langchain/community @langchain/core

You'll need to obtain an API key and set it as an environment variable named DEEPINFRA_API_TOKEN (or pass it into the constructor), then call the model as shown below:
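If you go the environment-variable route, you can export the token in your shell before running the script. A minimal sketch (the token value is a placeholder for your own key):

```shell
# Make the DeepInfra API key available to Node.js via process.env
# (the value below is a placeholder, not a real token).
export DEEPINFRA_API_TOKEN="your-api-key-here"
```

Alternatively, pass the key directly to the constructor via the apiKey field, as the example below does.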

import { DeepInfraLLM } from "@langchain/community/llms/deepinfra";

const apiKey = process.env.DEEPINFRA_API_TOKEN;
const model = "meta-llama/Meta-Llama-3-70B-Instruct";

const llm = new DeepInfraLLM({
  temperature: 0.7,
  maxTokens: 20,
  model,
  apiKey,
  maxRetries: 5,
});

const res = await llm.invoke(
  "What is the next step in the process of making a good game?"
);

console.log({ res });
