
HuggingFace Inference

This Embeddings integration uses the HuggingFace Inference API to generate embeddings for a given text, using the BAAI/bge-base-en-v1.5 model by default. You can pass a different model name to the constructor to use a different model.

Setup

You'll first need to install the @langchain/community package along with @langchain/core and the required peer dependency:

npm install @langchain/community @langchain/core @huggingface/inference@4

Usage

import { HuggingFaceInferenceEmbeddings } from "@langchain/community/embeddings/hf";

const embeddings = new HuggingFaceInferenceEmbeddings({
  apiKey: "YOUR-API-KEY", // Defaults to process.env.HUGGINGFACEHUB_API_KEY
  model: "MODEL-NAME", // Defaults to `BAAI/bge-base-en-v1.5` if not provided
  provider: "MODEL-PROVIDER", // Falls back to Hugging Face's auto provider selection if not provided
});
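Once constructed, the instance exposes the standard Embeddings methods: embedQuery returns a single vector (number[]), and embedDocuments returns one vector per input text (number[][]). Downstream, query and document vectors are typically compared with cosine similarity. A minimal sketch with short mock vectors standing in for the 768-dimensional vectors BAAI/bge-base-en-v1.5 produces (the helper and variable names are illustrative, not part of the library):

```typescript
// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Mock vectors; in practice these would come from
// `await embeddings.embedQuery(...)` and `await embeddings.embedDocuments(...)`.
const queryVector = [0.1, 0.2, 0.3, 0.4];
const docVectors = [
  [0.1, 0.2, 0.3, 0.4], // same direction as the query → similarity ≈ 1
  [0.4, 0.3, 0.2, 0.1],
];

const scores = docVectors.map((v) => cosineSimilarity(queryVector, v));
console.log(scores.map((s) => s.toFixed(2))); // first score prints as "1.00"
```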

Note: If you do not provide a model, a warning will be logged and the default model BAAI/bge-base-en-v1.5 will be used. If you do not provide a provider, Hugging Face will default to auto selection, which will select the first provider available for the model based on your settings at https://hf.co/settings/inference-providers.

Hint: hf-inference is the provider name for models that are hosted directly by Hugging Face.
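For example, to skip auto selection and always route requests to Hugging Face's own hosting, pass that provider name explicitly. A configuration sketch (the model shown is simply the default; it requires a valid API key at runtime, so it is not meant to run standalone):

```typescript
import { HuggingFaceInferenceEmbeddings } from "@langchain/community/embeddings/hf";

const embeddings = new HuggingFaceInferenceEmbeddings({
  model: "BAAI/bge-base-en-v1.5",
  provider: "hf-inference", // pin to Hugging Face's own hosting instead of auto selection
});
```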
