
AzionRetriever

Overview

This will help you get started with the AzionRetriever. For detailed documentation of all AzionRetriever features and configurations, head to the API reference.

Integration details

Retriever | Self-host | Cloud offering | Package | Py support
AzionRetriever | | | @langchain/community |

Setup

To use the AzionRetriever, you need to set the AZION_TOKEN environment variable.

process.env.AZION_TOKEN = "your-api-key";
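Since a missing token typically only surfaces as an error on the first request, it can help to fail fast at startup. A minimal sketch (the `requireEnv` helper name is mine, not part of the library):

```typescript
// Minimal sketch: throw immediately if a required environment variable is unset.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// e.g. const azionToken = requireEnv("AZION_TOKEN");
```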

If you are using OpenAI embeddings for this guide, you'll need to set your OpenAI key as well:

process.env.OPENAI_API_KEY = "YOUR_API_KEY";

If you want to get automated tracing from individual queries, you can also set your LangSmith API key by uncommenting below:

// process.env.LANGSMITH_API_KEY = "<YOUR API KEY HERE>";
// process.env.LANGSMITH_TRACING = "true";

Installation

This retriever lives in the @langchain/community/retrievers/azion_edgesql package:

yarn add azion @langchain/openai @langchain/community

Instantiation

Now we can instantiate our retriever:

import { AzionRetriever } from "@langchain/community/retrievers/azion_edgesql";
import { OpenAIEmbeddings } from "@langchain/openai";
import { ChatOpenAI } from "@langchain/openai";

const embeddingModel = new OpenAIEmbeddings({
  model: "text-embedding-3-small",
});

const chatModel = new ChatOpenAI({
  model: "gpt-4o-mini",
  apiKey: process.env.OPENAI_API_KEY,
});

const retriever = new AzionRetriever(embeddingModel, {
  dbName: "langchain",
  vectorTable: "documents", // table where the vector embeddings are stored
  ftsTable: "documents_fts", // table where the fts index is stored
  searchType: "hybrid", // search type to use for the retriever
  ftsK: 2, // number of results to return from the fts index
  similarityK: 2, // number of results to return from the vector index
  metadataItems: ["language", "topic"], // metadata columns to return with each result
  filters: [{ operator: "=", column: "language", value: "en" }], // metadata filters applied to the search
  entityExtractor: chatModel, // chat model used to extract entities from the query
});
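The filters option accepts a list of simple column comparisons that are all applied to the search. As an illustration (a sketch reusing the option shape shown above; the topic filter is hypothetical), restricting results to English technology documents could look like:

```typescript
// Illustrative filter list in the same shape as the constructor option above.
const techFilters = [
  { operator: "=", column: "language", value: "en" },
  { operator: "=", column: "topic", value: "technology" },
];
```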

Usage

const query = "Australia";

await retriever.invoke(query);
[
  Document {
    pageContent: 'Australia s indigenous people have inhabited the continent for over 65,000 years',
    metadata: { language: 'en', topic: 'history', searchtype: 'similarity' },
    id: '3'
  },
  Document {
    pageContent: 'Australia is a leader in solar energy adoption and renewable technology',
    metadata: { language: 'en', topic: 'technology', searchtype: 'similarity' },
    id: '5'
  },
  Document {
    pageContent: 'Australia s tech sector is rapidly growing with innovation hubs in major cities',
    metadata: { language: 'en', topic: 'technology', searchtype: 'fts' },
    id: '7'
  }
]
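Note that each returned document carries a searchtype metadata field indicating which index produced it. A small helper (a sketch with a simplified Document shape, not the library's class) can partition hybrid results by source:

```typescript
// Simplified document shape for illustration only.
interface RetrievedDoc {
  pageContent: string;
  metadata: { searchtype: string; [key: string]: string };
}

// Partition hybrid results by the index that produced them
// (e.g. "similarity" vs "fts").
function splitBySearchType(
  docs: RetrievedDoc[]
): Record<string, RetrievedDoc[]> {
  const groups: Record<string, RetrievedDoc[]> = {};
  for (const doc of docs) {
    const type = doc.metadata.searchtype;
    (groups[type] ??= []).push(doc);
  }
  return groups;
}
```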

Use within a chain

Like other retrievers, AzionRetriever can be incorporated into LLM applications via chains.

We will need an LLM or chat model:

Pick your chat model:

Install dependencies

yarn add @langchain/groq 

Add environment variables

GROQ_API_KEY=your-api-key

Instantiate the model

import { ChatGroq } from "@langchain/groq";

const llm = new ChatGroq({
  model: "llama-3.3-70b-versatile",
  temperature: 0,
});
import { ChatPromptTemplate } from "@langchain/core/prompts";
import {
  RunnablePassthrough,
  RunnableSequence,
} from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";

import type { Document } from "@langchain/core/documents";

const prompt = ChatPromptTemplate.fromTemplate(`
Answer the question based only on the context provided.

Context: {context}

Question: {question}`);

const formatDocs = (docs: Document[]) => {
  return docs.map((doc) => doc.pageContent).join("\n\n");
};

// See https://langchain.nodejs.cn/docs/tutorials/rag
const ragChain = RunnableSequence.from([
  {
    context: retriever.pipe(formatDocs),
    question: new RunnablePassthrough(),
  },
  prompt,
  llm,
  new StringOutputParser(),
]);

await ragChain.invoke("Paris");
The context mentions that the 2024 Olympics are in Paris.
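The formatDocs helper in the chain simply concatenates the retrieved page contents with blank lines to build the {context} value for the prompt. The same logic in isolation, as a standalone sketch:

```typescript
// Standalone version of the chain's formatDocs logic:
// join retrieved page contents with blank lines.
const formatDocs = (docs: { pageContent: string }[]): string =>
  docs.map((doc) => doc.pageContent).join("\n\n");

// formatDocs([{ pageContent: "first" }, { pageContent: "second" }])
// yields the two contents separated by a blank line.
```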

API reference

For detailed documentation of all AzionRetriever features and configurations, head to the API reference.