Few Shot Prompt Templates
Few shot prompting is a prompting technique which provides the Large Language Model (LLM) with a list of examples, and then asks the LLM to generate some text following the lead of the examples provided.
An example of this is the following:
Say you want your LLM to respond in a specific format. You can few shot prompt the LLM with a list of question-and-answer pairs so it knows what format to respond in.
Respond to the user's question in the following format:
Question: What is your name?
Answer: My name is John.
Question: What is your age?
Answer: I am 25 years old.
Question: What is your favorite color?
Answer:
Here we left the last Answer: undefined so the LLM can fill it in. The LLM will then generate the following:
Answer: I don't have a favorite color; I don't have preferences.
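For instance, here is a minimal sketch of sending that hand-written prompt straight to a chat model (this assumes the @langchain/openai package and an OpenAI API key in the environment; the exact completion will vary):

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({});

// The hand-written few shot prompt from above, passed to the model as a plain string.
const prompt = `Respond to the user's question in the following format:

Question: What is your name?
Answer: My name is John.

Question: What is your age?
Answer: I am 25 years old.

Question: What is your favorite color?
Answer:`;

const response = await model.invoke(prompt);
console.log(response.content);
// e.g. "I don't have a favorite color; I don't have preferences."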
Use Case
In the following example we're few shotting the LLM to rephrase questions into more general queries.
We provide two sets of examples with specific questions, and rephrased general questions. The FewShotChatMessagePromptTemplate will use our examples, and when .format is called, we'll see those examples formatted into a string we can pass to the LLM.
import {
ChatPromptTemplate,
FewShotChatMessagePromptTemplate,
} from "langchain/prompts";
const examples = [
{
input: "Could the members of The Police perform lawful arrests?",
output: "what can the members of The Police do?",
},
{
input: "Jan Sindel's was born in what country?",
output: "what is Jan Sindel's personal history?",
},
];
const examplePrompt = ChatPromptTemplate.fromTemplate(`Human: {input}
AI: {output}`);
const fewShotPrompt = new FewShotChatMessagePromptTemplate({
examplePrompt,
examples,
inputVariables: [], // no input variables
});
const formattedPrompt = await fewShotPrompt.format({});
console.log(formattedPrompt);
[
HumanMessage {
lc_namespace: [ 'langchain', 'schema' ],
content: 'Human: Could the members of The Police perform lawful arrests?\n' +
'AI: what can the members of The Police do?',
additional_kwargs: {}
},
HumanMessage {
lc_namespace: [ 'langchain', 'schema' ],
content: "Human: Jan Sindel's was born in what country?\n" +
"AI: what is Jan Sindel's personal history?",
additional_kwargs: {}
}
]
Then, if we use this with another question, the LLM will rephrase the question how we want.
- npm: npm install @langchain/openai @langchain/core
- Yarn: yarn add @langchain/openai @langchain/core
- pnpm: pnpm add @langchain/openai @langchain/core
import { ChatOpenAI } from "@langchain/openai";
import {
  ChatPromptTemplate,
  FewShotChatMessagePromptTemplate,
} from "langchain/prompts";
const model = new ChatOpenAI({});
const examples = [
{
input: "Could the members of The Police perform lawful arrests?",
output: "what can the members of The Police do?",
},
{
input: "Jan Sindel's was born in what country?",
output: "what is Jan Sindel's personal history?",
},
];
const examplePrompt = ChatPromptTemplate.fromTemplate(`Human: {input}
AI: {output}`);
const fewShotPrompt = new FewShotChatMessagePromptTemplate({
prefix:
"Rephrase the users query to be more general, using the following examples",
suffix: "Human: {input}",
examplePrompt,
examples,
inputVariables: ["input"],
});
const formattedPrompt = await fewShotPrompt.format({
input: "What's France's main city?",
});
const response = await model.invoke(formattedPrompt);
console.log(response);
AIMessage {
lc_namespace: [ 'langchain', 'schema' ],
content: 'What is the capital of France?',
additional_kwargs: { function_call: undefined }
}
Few Shotting With Functions
You can also partial with a function. The use case for this is when you have a variable that you know you always want to fetch in a common way. A prime example of this is with date or time. Imagine you have a prompt which you always want to have the current date. You can't hard code it in the prompt, and passing it along with the other input variables can be tedious. In this case, it's very handy to be able to partial the prompt with a function that always returns the current date.
import { PromptTemplate } from "langchain/prompts";

const getCurrentDate = () => {
  return new Date().toISOString();
};

const prompt = new PromptTemplate({
  template: "Tell me a {adjective} joke about the day {date}",
  inputVariables: ["adjective", "date"],
});
const partialPrompt = await prompt.partial({
date: getCurrentDate,
});
const formattedPrompt = await partialPrompt.format({
adjective: "funny",
});
console.log(formattedPrompt);
// Tell me a funny joke about the day 2023-07-13T00:54:59.287Z
Few Shot vs Chat Few Shot
The chat and non chat few shot prompt templates act in a similar way. The example below demonstrates both, and the differences in their outputs.
import {
  PromptTemplate,
  ChatPromptTemplate,
  FewShotPromptTemplate,
  FewShotChatMessagePromptTemplate,
} from "langchain/prompts";
const examples = [
{
input: "Could the members of The Police perform lawful arrests?",
output: "what can the members of The Police do?",
},
{
input: "Jan Sindel's was born in what country?",
output: "what is Jan Sindel's personal history?",
},
];
const prompt = `Human: {input}
AI: {output}`;
const examplePromptTemplate = PromptTemplate.fromTemplate(prompt);
const exampleChatPromptTemplate = ChatPromptTemplate.fromTemplate(prompt);
const chatFewShotPrompt = new FewShotChatMessagePromptTemplate({
examplePrompt: exampleChatPromptTemplate,
examples,
inputVariables: [], // no input variables
});
const fewShotPrompt = new FewShotPromptTemplate({
examplePrompt: examplePromptTemplate,
examples,
inputVariables: [], // no input variables
});
console.log("Chat Few Shot: ", await chatFewShotPrompt.formatMessages({}));
/**
Chat Few Shot: [
HumanMessage {
lc_namespace: [ 'langchain', 'schema' ],
content: 'Human: Could the members of The Police perform lawful arrests?\n' +
'AI: what can the members of The Police do?',
additional_kwargs: {}
},
HumanMessage {
lc_namespace: [ 'langchain', 'schema' ],
content: "Human: Jan Sindel's was born in what country?\n" +
"AI: what is Jan Sindel's personal history?",
additional_kwargs: {}
}
]
*/
console.log("Few Shot: ", await fewShotPrompt.formatPromptValue({}));
/**
Few Shot:
Human: Could the members of The Police perform lawful arrests?
AI: what can the members of The Police do?
Human: Jan Sindel's was born in what country?
AI: what is Jan Sindel's personal history?
*/
Here we can see the main distinctions between FewShotChatMessagePromptTemplate and FewShotPromptTemplate: input and output values.
FewShotChatMessagePromptTemplate works by taking in a ChatPromptTemplate for its examples, and its output is a list of BaseMessage instances.
On the other hand, FewShotPromptTemplate works by taking in a PromptTemplate for its examples, and its output is a string.
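A minimal, self-contained sketch of that difference in terms of TypeScript types (the examples array here is a toy one purely for illustration):

import {
  ChatPromptTemplate,
  PromptTemplate,
  FewShotChatMessagePromptTemplate,
  FewShotPromptTemplate,
} from "langchain/prompts";
import { BaseMessage } from "langchain/schema";

const examples = [{ input: "hi", output: "hello" }];

// Chat variant: examples are formatted with a ChatPromptTemplate and returned as messages.
const chatFewShot = new FewShotChatMessagePromptTemplate({
  examplePrompt: ChatPromptTemplate.fromTemplate(`Human: {input}
AI: {output}`),
  examples,
  inputVariables: [],
});
const messages: BaseMessage[] = await chatFewShot.formatMessages({});

// Non chat variant: examples are formatted with a PromptTemplate and joined into one string.
const fewShot = new FewShotPromptTemplate({
  examplePrompt: PromptTemplate.fromTemplate(`Human: {input}
AI: {output}`),
  examples,
  inputVariables: [],
});
const text: string = await fewShot.format({});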
With Non Chat Models
LangChain also provides a class for few shot prompt formatting for non chat models: FewShotPromptTemplate. The API is largely the same, but the output is formatted differently (a single string instead of chat messages).
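As a rough sketch (assuming the OpenAI completion model class from @langchain/openai and an API key in the environment), the string output of FewShotPromptTemplate can be passed directly to a non chat model:

import { OpenAI } from "@langchain/openai";
import { FewShotPromptTemplate, PromptTemplate } from "langchain/prompts";

const examplePrompt = PromptTemplate.fromTemplate(`Human: {input}
AI: {output}`);

const fewShotPrompt = new FewShotPromptTemplate({
  prefix:
    "Rephrase the users query to be more general, using the following examples",
  suffix: "Human: {input}",
  examplePrompt,
  examples: [
    {
      input: "Could the members of The Police perform lawful arrests?",
      output: "what can the members of The Police do?",
    },
  ],
  inputVariables: ["input"],
});

// `format` returns a plain string, which is what completion style models expect.
const formatted = await fewShotPrompt.format({
  input: "What's France's main city?",
});

const model = new OpenAI({});
const response = await model.invoke(formatted);
console.log(response); // a string completion, e.g. "What is the capital of France?"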
Partials With Functions
import {
  PromptTemplate,
  FewShotPromptTemplate,
} from "langchain/prompts";
const examplePrompt = PromptTemplate.fromTemplate("{foo}{bar}");
const prompt = new FewShotPromptTemplate({
  prefix: "{foo}{bar}",
  examplePrompt,
  examples: [], // no examples needed; this snippet only demonstrates partialing the prefix
  inputVariables: ["foo", "bar"],
});
const partialPrompt = await prompt.partial({
foo: () => Promise.resolve("boo"),
});
const formatted = await partialPrompt.format({ bar: "baz" });
console.log(formatted);
boobaz\n
With Functions and Example Selector
import {
  PromptTemplate,
  FewShotPromptTemplate,
  LengthBasedExampleSelector,
} from "langchain/prompts";
const examplePrompt = PromptTemplate.fromTemplate("An example about {x}");
const exampleSelector = await LengthBasedExampleSelector.fromExamples(
[{ x: "foo" }, { x: "bar" }],
{ examplePrompt, maxLength: 200 }
);
const prompt = new FewShotPromptTemplate({
prefix: "{foo}{bar}",
exampleSelector,
examplePrompt,
inputVariables: ["foo", "bar"],
});
const partialPrompt = await prompt.partial({
foo: () => Promise.resolve("boo"),
});
const formatted = await partialPrompt.format({ bar: "baz" });
console.log(formatted);
boobaz
An example about foo
An example about bar