Friendli

Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options, tailored for high-demand AI workloads.
This tutorial guides you through integrating Friendli with LangChain.
Setup
Ensure the @langchain/community package is installed.
- npm: npm install @langchain/community @langchain/core
- Yarn: yarn add @langchain/community @langchain/core
- pnpm: pnpm add @langchain/community @langchain/core
Sign in to Friendli Suite to create a Personal Access Token, and set it as the FRIENDLI_TOKEN environment variable. You can set your team ID as the FRIENDLI_TEAM environment variable.
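For reference, here is a minimal Node.js/TypeScript sketch of reading these values before constructing the model; the error message is only illustrative:

// Read the credentials configured above; fail fast if the token is missing.
const friendliToken = process.env.FRIENDLI_TOKEN;
const friendliTeam = process.env.FRIENDLI_TEAM; // optional team ID

if (!friendliToken) {
  throw new Error("FRIENDLI_TOKEN environment variable is not set");
}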
You can initialize a Friendli chat model by selecting the model you want to use. The default model is mixtral-8x7b-instruct-v0-1. You can check the available models at docs.friendli.ai.
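As a minimal sketch of selecting a non-default model (the model ID below is only an example; substitute one listed at docs.friendli.ai), ahead of the full example in the next section:

import { Friendli } from "@langchain/community/llms/friendli";

// Omitting `model` falls back to the default, mixtral-8x7b-instruct-v0-1.
// "meta-llama-3-8b-instruct" is an illustrative model ID.
const llm = new Friendli({
  model: "meta-llama-3-8b-instruct",
  friendliToken: process.env.FRIENDLI_TOKEN,
});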
Usage
import { Friendli } from "@langchain/community/llms/friendli";

const model = new Friendli({
  model: "mixtral-8x7b-instruct-v0-1", // Default value
  friendliToken: process.env.FRIENDLI_TOKEN,
  friendliTeam: process.env.FRIENDLI_TEAM,
  maxTokens: 18,
  temperature: 0.75,
  topP: 0.25,
  frequencyPenalty: 0,
  stop: [],
});

const response = await model.invoke(
  "Check the Grammar: She dont like to eat vegetables, but she loves fruits."
);

console.log(response);
/*
Correct: She doesn't like to eat vegetables, but she loves fruits
*/
const stream = await model.stream(
  "Check the Grammar: She dont like to eat vegetables, but she loves fruits."
);

for await (const chunk of stream) {
  console.log(chunk);
}
/*
Cor
rect
:
She
doesn
...
she
loves
fruits
*/
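To use Friendli in a larger LangChain pipeline, here is a minimal sketch (reusing the model instance from above; the prompt text is illustrative) of composing it with a prompt template and an output parser:

import { PromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

// Compose prompt -> Friendli LLM -> string output parser into one runnable chain.
const prompt = PromptTemplate.fromTemplate("Check the Grammar: {sentence}");
const chain = prompt.pipe(model).pipe(new StringOutputParser());

const corrected = await chain.invoke({
  sentence: "She dont like to eat vegetables, but she loves fruits.",
});
console.log(corrected);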
API Reference:
- Friendli from @langchain/community/llms/friendli
Related

- LLM conceptual guide
- LLM how-to guides