Semantic Kernel: The Kernel

  SK currently ships connectors for OpenAI, Azure OpenAI, Gemini, HuggingFace, MistralAI, and other LLMs, and the list will surely keep growing.

  First, reference the NuGet package for the LLM you want to use. The project file looks like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net9.0</TargetFramework>
    <RootNamespace>Demo01_Kernel</RootNamespace>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.Extensions.Logging" Version="9.0.0-preview.4.24266.19" />
    <PackageReference Include="Microsoft.Extensions.Logging.Console" Version="9.0.0-preview.4.24266.19" />
    <PackageReference Include="Microsoft.SemanticKernel" Version="1.14.1" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.Google" Version="1.14.1-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.HuggingFace" Version="1.14.1-preview" />
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.MistralAI" Version="1.14.1-alpha" />
  </ItemGroup>
</Project>
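
  Note that the Gemini, HuggingFace, and MistralAI connectors are still alpha/preview packages whose APIs are marked experimental, so the compiler raises SKEXP diagnostics when they are used. The sample below silences SKEXP0010 and SKEXP0070 with #pragma warning disable; alternatively (a sketch of the same suppression done project-wide), the diagnostics can be turned off in the project file:

  <PropertyGroup>
    <!-- Suppress Semantic Kernel experimental-API diagnostics for the whole project -->
    <NoWarn>$(NoWarn);SKEXP0010;SKEXP0070</NoWarn>
  </PropertyGroup>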

  As an SDK for working with LLMs, SK revolves around the Kernel, much like ASP.NET Core revolves around its host: first create a builder via Kernel.CreateBuilder(), then register the services you need on the builder's Services collection, then call Build() to obtain the Kernel, and finally use the Kernel to invoke the various features.
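
  Stripped to its essentials, the flow looks like this (a minimal sketch; the model id and API key are placeholders):

using Microsoft.SemanticKernel;

// 1. Create a builder and register an LLM connector on it.
var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(modelId: "gpt-4o", apiKey: "<your-api-key>");

// 2. Build the kernel.
Kernel kernel = builder.Build();

// 3. Use the kernel, e.g. run a prompt and print the reply.
var result = await kernel.InvokePromptAsync("Hello");
Console.WriteLine(result.GetValue<string>());

  The full example below does the same, but also plugs in a custom IAIServiceSelector, a custom service, and console logging: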

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Logging.Console;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Services;
using System.Diagnostics.CodeAnalysis;

var chatModelId = "gpt-4o";
// Read the API key from a local file.
var key = File.ReadAllText(@"C:\GPT\key.txt");
var endpoint = "";
// Suppress diagnostics for experimental connector APIs.
#pragma warning disable SKEXP0010
#pragma warning disable SKEXP0070
// Register a chat-completion connector; the commented lines show the other supported connectors.
var builder = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(chatModelId, key);
//.AddAzureOpenAIChatCompletion(chatModelId, endpoint, key)
//.AddGoogleAIGeminiChatCompletion(chatModelId, key)
//.AddHuggingFaceChatCompletion(chatModelId, apiKey: key)
//.AddMistralChatCompletion(chatModelId, key)
//builder.Services.AddSingleton<IAIServiceSelector, MyAIServiceSelector>();
// Register a custom IAIServiceSelector.
builder.Services.AddScoped<IAIServiceSelector, MyAIServiceSelector>();
// Register a custom application service.
builder.Services.AddScoped<IMyService, MyService>();
// Register console logging.
builder.Services.AddLogging(c => c
    .AddConsole()
    //.AddJsonConsole()
    .SetMinimumLevel(LogLevel.Information));
Kernel kernel = builder.Build();

var logger = kernel.LoggerFactory.CreateLogger("logger");
var prompt = "Hello, what can you help me with?";
var result = await kernel.InvokePromptAsync(prompt);
var message = @$"Response:
{result.GetValue<string>()}";
logger.LogInformation(message);


class MyAIServiceSelector : IAIServiceSelector
{
    private readonly IMyService _myService;
    private readonly ILogger<MyAIServiceSelector> _logger;
    public MyAIServiceSelector(IMyService myService, ILogger<MyAIServiceSelector> logger)
    {
        _myService = myService;
        _logger = logger;
    }
    public bool TrySelectAIService<T>(
        Kernel kernel, KernelFunction function, KernelArguments arguments,
        [NotNullWhen(true)] out T? service, out PromptExecutionSettings? serviceSettings) where T : class, IAIService
    {
        // Demonstrate that the custom service was injected into the selector.
        _myService.Print();
        // Walk all registered AI services and select the first one that exposes a model id.
        foreach (var serviceToCheck in kernel.GetAllServices<T>())
        {
            var serviceModelId = serviceToCheck.GetModelId();
            var endpoint = serviceToCheck.GetEndpoint();
            if (!string.IsNullOrEmpty(serviceModelId))
            {
                _logger.LogInformation($"使用的模型: {serviceModelId} {endpoint}");
                _logger.LogInformation($"服务类型: {serviceToCheck.GetType().Name}");
                service = serviceToCheck;
                serviceSettings = new OpenAIPromptExecutionSettings();
                return true;
            }
        }
        service = null;
        serviceSettings = null;
        return false;
    }
}
// A custom service that gets injected into the selector above.
interface IMyService
{
    void Print();
}
class MyService : IMyService
{
    readonly ILogger<MyService> _logger;
    public MyService(ILogger<MyService> logger)
    {
        _logger = logger;
        logger.LogInformation("MyService实例化");
    }
    public void Print()
    {
        _logger.LogWarning("Alert raised");
    }
}
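
  When InvokePromptAsync runs, the kernel asks the registered IAIServiceSelector which of the registered AI services should handle the call, which is why the log output from MyAIServiceSelector and MyService appears before the model's reply. Services registered on the builder can also be resolved directly from the built kernel; for example (a short sketch reusing the kernel and IMyService from the code above):

// Resolve the chat-completion connector that was registered on the builder.
var chatService = kernel.GetRequiredService<IChatCompletionService>();
Console.WriteLine(chatService.GetType().Name);

// Resolve the custom service through the kernel's service provider.
var myService = kernel.Services.GetRequiredService<IMyService>();
myService.Print();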

  This article originally appeared on a WeChat official account.

  To follow related topics more quickly and conveniently, you can subscribe to the WeChat official account.
