HuggingFaceInference
Here's an example of calling a HuggingFaceInference model as an LLM:
- npm
- Yarn
- pnpm
npm install @langchain/community @langchain/core @huggingface/inference@4
yarn add @langchain/community @langchain/core @huggingface/inference@4
pnpm add @langchain/community @langchain/core @huggingface/inference@4
tip
We're unifying model params across all packages. We now suggest using model instead of modelName, and apiKey for API keys.
import { HuggingFaceInference } from "@langchain/community/llms/hf";
const model = new HuggingFaceInference({
model: "gpt2",
apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.HUGGINGFACEHUB_API_KEY
});
const res = await model.invoke("1 + 1 =");
console.log({ res });
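Because HuggingFaceInference implements the standard LangChain Runnable interface, you can also stream the output. The sketch below is a minimal example assuming the same gpt2 model and API key setup as above; .stream() comes from the base Runnable interface, so it is available even if the provider returns the result as a single chunk.
import { HuggingFaceInference } from "@langchain/community/llms/hf";
const model = new HuggingFaceInference({
  model: "gpt2",
  apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.HUGGINGFACEHUB_API_KEY
});
// Stream chunks as they arrive instead of waiting for the full completion.
const stream = await model.stream("Tell me a short story about a robot:");
for await (const chunk of stream) {
  process.stdout.write(chunk);
}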
Related
LLM conceptual guide
LLM how-to guides