Gradient AI

LangChain.js supports integration with Gradient AI. Check out Gradient AI for a list of available models.

Setup

You'll need to install the official Gradient Node SDK as a peer dependency:

npm i @gradientai/nodejs-sdk

You will need to set the following environment variables to use the Gradient AI API:

  1. GRADIENT_ACCESS_TOKEN
  2. GRADIENT_WORKSPACE_ID
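
For instance, here is a minimal sketch (assuming a Node.js runtime where these variables are loaded from your shell or a .env file) that fails fast if either credential is missing:

// Fail fast if the required Gradient AI credentials are not set
for (const name of ["GRADIENT_ACCESS_TOKEN", "GRADIENT_WORKSPACE_ID"]) {
  if (!process.env[name]) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
}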

Alternatively, these can be set during GradientLLM class instantiation as gradientAccessKey and workspaceId respectively. For example:

const model = new GradientLLM({
  // These can be supplied here instead of the GRADIENT_ACCESS_TOKEN
  // and GRADIENT_WORKSPACE_ID environment variables
  gradientAccessKey: "My secret Access Token",
  workspaceId: "My secret workspace id",
});

Usage

npm install @langchain/community @langchain/core

Using Gradient's Base Models

import { GradientLLM } from "@langchain/community/llms/gradient_ai";

// Note that inferenceParameters are optional
const model = new GradientLLM({
  modelSlug: "llama2-7b-chat",
  inferenceParameters: {
    maxGeneratedTokenCount: 20,
    temperature: 0,
  },
});
const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);

console.log({ res });

API Reference: GradientLLM from @langchain/community/llms/gradient_ai
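
Because GradientLLM implements the standard LangChain Runnable interface, it can be composed with other components. Here is a minimal sketch (reusing the same modelSlug and environment variables as above) that pipes a prompt template into the model; the product placeholder is purely illustrative:

import { PromptTemplate } from "@langchain/core/prompts";
import { GradientLLM } from "@langchain/community/llms/gradient_ai";

const model = new GradientLLM({
  modelSlug: "llama2-7b-chat",
});

// Build a simple prompt -> model chain with .pipe()
const prompt = PromptTemplate.fromTemplate(
  "What would be a good company name for a company that makes {product}?"
);
const chain = prompt.pipe(model);

const res = await chain.invoke({ product: "colorful socks" });
console.log({ res });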

Using your own fine-tuned Adapters

To use your own custom adapter, simply set adapterId during setup.

import { GradientLLM } from "@langchain/community/llms/gradient_ai";

// Note that inferenceParameters are optional
const model = new GradientLLM({
  adapterId: process.env.GRADIENT_ADAPTER_ID,
  inferenceParameters: {
    maxGeneratedTokenCount: 20,
    temperature: 0,
  },
});
const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);

console.log({ res });

API Reference: GradientLLM from @langchain/community/llms/gradient_ai
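
As with any LangChain LLM, you can also run several prompts in one call with the standard .batch() method. A minimal sketch, reusing the adapter-backed model above with hypothetical prompts:

// Run multiple prompts against the fine-tuned adapter in one call
const results = await model.batch([
  "Suggest a name for a sock company.",
  "Suggest a slogan for a sock company.",
]);

console.log(results);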
