Semantic Kernel Getting Started Series: 🥑 Memory


With an understanding of how it all works, we can start building applications with Semantic Kernel.

Semantic Kernel wraps the embedding functionality in Memory, which is used to store contextual information. It works much like a computer's RAM, with the LLM playing the role of the CPU: all we need to do is fetch the relevant information from memory and hand it to the CPU for processing.

Configuring Memory

First, install the Microsoft.SemanticKernel.Plugins.Memory NuGet package; the Memory functionality is provided as a plugin.

Using Memory requires registering an embedding model; the one most commonly used today is text-embedding-ada-002. You also need to add a MemoryStore to hold the stored information. Semantic Kernel provides a VolatileMemoryStore, a plain in-memory MemoryStore that is fine for testing; for production use, one of the vector-database-backed stores is recommended.

var memory = new MemoryBuilder()
                .WithAzureOpenAITextEmbeddingGeneration("text-embedding-ada-002",
                    Environment.GetEnvironmentVariable("AZURE_OPENAI_API_ENDPOINT"),
                    Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY"))
                .WithMemoryStore(new VolatileMemoryStore())
                .Build();

Storing Information

With the basic configuration registered, we can start saving information into Memory.

const string MemoryCollectionName = "aboutMe";

await memory.SaveInformationAsync(MemoryCollectionName, id: "info1", text: "My name is Andrea");
await memory.SaveInformationAsync(MemoryCollectionName, id: "info2", text: "I currently work as a tourist operator");
await memory.SaveInformationAsync(MemoryCollectionName, id: "info3", text: "I currently live in Seattle and have been living there since 2005");
await memory.SaveInformationAsync(MemoryCollectionName, id: "info4", text: "I visited France and Italy five times since 2015");
await memory.SaveInformationAsync(MemoryCollectionName, id: "info5", text: "My family is from New York");

SaveInformationAsync runs the text through the embedding model to produce a vector and stores it in the MemoryStore. The CollectionName acts like a database table name, and the Id is simply the record's key.
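
As a side note, a saved entry can also be read back directly by collection name and id. This is a minimal sketch, assuming the memory instance and collection built above:

// Read a single record back by collection + id
var info1 = await memory.GetAsync(MemoryCollectionName, "info1");
Console.WriteLine(info1?.Metadata.Text); // "My name is Andrea"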

Semantic Search

Once the information has been stored, it can be searched semantically.

Simply call memory.SearchAsync, specifying the collection and the query. The query is also converted into an embedding, and the MemoryStore is then searched for the most similar entries.


var questions = new[]
{
    "what is my name?",
    "where do I live?",
    "where is my family from?",
    "where have I travelled?",
    "what do I do for work?",
};

foreach (var q in questions)
{
    await foreach (var response in memory.SearchAsync(MemoryCollectionName, q))
    {
        Console.WriteLine(q + " " + response?.Metadata.Text);
    }
}

// output
/*
what is my name? My name is Andrea
where do I live? I currently live in Seattle and have been living there since 2005
where is my family from? My family is from New York
where have I travelled? I visited France and Italy five times since 2015
what do I do for work? I currently work as a tourist operator
*/

At this point, even without any summarization, semantic search on its own is already quite valuable.

Storing References

Besides plain information, you can also save references, such as useful links for further reading.

const string memoryCollectionName = "SKGitHub";

var githubFiles = new Dictionary<string, string>()
{
	["https://github.com/microsoft/semantic-kernel/blob/main/README.md"]
		= "README: Installation, getting started, and how to contribute",
	["https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/2-running-prompts-from-file.ipynb"]
		= "Jupyter notebook describing how to pass prompts from a file to a semantic plugin or function",
	["https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/Getting-Started-Notebook.ipynb"]
		= "Jupyter notebook describing how to get started with the Semantic Kernel",
	["https://github.com/microsoft/semantic-kernel/tree/main/samples/plugins/Chatplugin/ChatGPT"]
		= "Sample demonstrating how to create a chat plugin interfacing with ChatGPT",
	["https://github.com/microsoft/semantic-kernel/blob/main/dotnet/src/SemanticKernel/Memory/Volatile/VolatileMemoryStore.cs"]
		= "C# class that defines a volatile embedding store",
	["https://github.com/microsoft/semantic-kernel/tree/main/samples/dotnet/KernelHttpServer/README.md"]
		= "README: How to set up a Semantic Kernel Service API using Azure Function Runtime v4",
	["https://github.com/microsoft/semantic-kernel/tree/main/samples/apps/chat-summary-webapp-react/README.md"]
		= "README: README associated with a sample starter react-based chat summary webapp",
};

Console.WriteLine("Adding some GitHub file URLs and their descriptions to a volatile Semantic Memory.");
foreach (var entry in githubFiles)
{
	await memory.SaveReferenceAsync(
		collection: memoryCollectionName,
		description: entry.Value,
		text: entry.Value,
		externalId: entry.Key,
		externalSourceName: "GitHub"
	);
}

Likewise, searching works the same way with SearchAsync.

string ask = "I love Jupyter notebooks, how should I get started?";
Console.WriteLine("===========================\n" +
					"Query: " + ask + "\n");

var memories = memory.SearchAsync(memoryCollectionName, ask, limit: 5, minRelevanceScore: 0.77);
var i = 0;
await foreach (MemoryQueryResult result in memories)
{
	Console.WriteLine($"Result {++i}:");
	Console.WriteLine("  URL:     : " + result.Metadata.Id);
	Console.WriteLine("  Title    : " + result.Metadata.Description);
	Console.WriteLine("  ExternalSource: " + result.Metadata.ExternalSourceName);
	Console.WriteLine("  Relevance: " + result.Relevance);
	Console.WriteLine();
}
//output
/*
===========================
Query: I love Jupyter notebooks, how should I get started?

Result 1:
  URL:     : https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/Getting-Started-Notebook.ipynb
  Title    : Jupyter notebook describing how to get started with the Semantic Kernel
  ExternalSource: GitHub
  Relevance: 0.8677417635917664

Result 2:
  URL:     : https://github.com/microsoft/semantic-kernel/blob/main/samples/notebooks/dotnet/2-running-prompts-from-file.ipynb
  Title    : Jupyter notebook describing how to pass prompts from a file to a semantic plugin or function
  ExternalSource: GitHub
  Relevance: 0.8165638446807861

Result 3:
  URL:     : https://github.com/microsoft/semantic-kernel/blob/main/README.md
  Title    : README: Installation, getting started, and how to contribute
  ExternalSource: GitHub
  Relevance: 0.808631956577301
*/

Two extra parameters are used here: limit caps the number of results returned, so only the top most-similar entries come back, and minRelevanceScore sets a minimum relevance score, which ranges from 0.0 to 1.0, with 1.0 meaning an exact match.

Semantic Question Answering

Combining Memory's storage and search capabilities with a semantic plugin makes it easy to build a practical question-answering application.

All we need to do is fill the prompt with the relevant information found by the search, hand both the information and the question to the LLM, and wait for a satisfying answer.


var prompt = 
"""
It can give explicit instructions or say 'I don't know' if it does not have an answer.

Information about me, from previous conversations:
{{ $fact }}

User: {{ $ask }}
ChatBot:
""";
var function = kernel.CreateFunctionFromPrompt(prompt); // kernel: a Kernel with a completion model, configured as in the earlier posts of this series
var ask = "Hello, I think we've met before, remember? my name is...";
var fact = await memory.SearchAsync(MemoryCollectionName, ask).FirstOrDefaultAsync(); // FirstOrDefaultAsync() requires the System.Linq.Async package
var args = new KernelArguments();
args["fact"] = fact?.Metadata?.Text;
args["ask"] = ask;

var resultContext = await function.InvokeAsync(kernel, args);
resultContext.GetValue<string>().Dump(); // Dump(): LINQPad-style helper that prints the value


//output
/*
Hi there! Yes, I remember you. Your name is Andrea, right?
*/

Streamlining the Search

Because this scenario is so common, Semantic Kernel ships with a plugin, TextMemoryPlugin, that simplifies the search step by exposing it as a function call.

// ... SaveInformationAsync calls as above

// TextMemoryPlugin provides the "recall" function
kernel.ImportPluginFromObject(new TextMemoryPlugin(memory));


var prompt =
"""
It can give explicit instructions or say 'I don't know' if it does not have an answer.

Information about me, from previous conversations:
{{ recall $ask }}

User: {{ $ask }}
ChatBot:
""";

var function = kernel.CreateFunctionFromPrompt(prompt);
var ask = "Hello, I think we've met before, remember? my name is...";

var args = new KernelArguments();
args["ask"] = ask;
args[TextMemoryPlugin.CollectionParam] = MemoryCollectionName;

var resultContext = await function.InvokeAsync(kernel, args);
resultContext.GetValue<string>().Dump();

// output
/*
Hi there! Yes, I remember you. Your name is Andrea, right?
*/

Here the recall function is invoked directly in the prompt, passing the question to TextMemoryPlugin, which performs the search and returns the matching result, so there is no need to search and inject the information manually.
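
recall can also be tuned through the same KernelArguments. A small sketch, assuming the TextMemoryPlugin.RelevanceParam and TextMemoryPlugin.LimitParam constants that sit alongside CollectionParam, added before InvokeAsync:

// Optional tuning for recall (sketch; these constants are assumed to be available in your version)
args[TextMemoryPlugin.RelevanceParam] = "0.8"; // minimum relevance score for recalled facts
args[TextMemoryPlugin.LimitParam] = "2";       // recall at most two facts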

Persisting Memory

VolatileMemoryStore is, as the name suggests, volatile. In most scenarios that use Memory, the stored information is meant to be kept long-term, or at least should not expire immediately, so persisting the embeddings is well worth it: after all, every embedding conversion is another API call, and API calls cost money.

The Semantic Kernel library includes MemoryStore implementations for SQLite, Qdrant, and CosmosDB; to build your own, you only need to implement the IMemoryStore interface.
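
For example, here is a minimal sketch of swapping in SQLite for persistence, assuming the SQLite memory connector package is installed (package and type names may differ slightly between Semantic Kernel versions; "memories.sqlite" is an arbitrary file name):

// Sketch: store embeddings in a local SQLite file instead of a VolatileMemoryStore
var sqliteStore = await SqliteMemoryStore.ConnectAsync("memories.sqlite");

var persistentMemory = new MemoryBuilder()
    .WithAzureOpenAITextEmbeddingGeneration("text-embedding-ada-002",
        Environment.GetEnvironmentVariable("AZURE_OPENAI_API_ENDPOINT"),
        Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY"))
    .WithMemoryStore(sqliteStore)
    .Build();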

