AI Integration

Preface

In the previous posts we applied for a model and verified through testing that it is reachable. This post continues with Ollama: we can already chat with the model in the command-line terminal, so now we will integrate the AI into code.

Preparation

As a .NET developer (a "Neter"), .NET is used here. The first step is to create a project; a WebApi project is used in this post, but a console app would work just as well. The project can be created from the CLI, as sketched below.
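A minimal sketch of creating the project from the command line (the project name AIChatDemo is just a placeholder; use whatever name you like):

dotnet new webapi -n AIChatDemo
cd AIChatDemo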

Semantic Kernel is used to connect to the AI. Semantic Kernel is a toolkit that helps applications connect to AI models; here is the official description:

Semantic Kernel is a lightweight, open-source development kit that lets you easily build AI agents and integrate the latest AI models into your C#, Python, or Java codebase. It serves as an efficient middleware that enables rapid delivery of enterprise-grade solutions. 

Add the SemanticKernel packages

dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.SemanticKernel.Connectors.Ollama

The Ollama connector is currently an alpha release; when searching in NuGet you need to tick "Include prerelease". From the CLI, pass the prerelease flag as shown below.
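For example, from the command line:

dotnet add package Microsoft.SemanticKernel.Connectors.Ollama --prerelease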

Ollama Integration Example

Registration

Program.cs

using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel;
using OllamaSharp.Models;
using OllamaSharp;

var builder = WebApplication.CreateBuilder(args);

// Register the OllamaSharp client as a singleton so controllers can inject it
var endpoint = new Uri("http://localhost:11434");
var modelId = "llama3:latest";
builder.Services.AddSingleton(new OllamaApiClient(endpoint, modelId));

builder.Services.AddControllers();
var app = builder.Build();
app.MapControllers();
app.Run();
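The connector package installed above can also wire Ollama straight into Semantic Kernel as a chat completion service, instead of talking to OllamaSharp directly. A minimal sketch, assuming the alpha connector exposes an AddOllamaChatCompletion extension with a (modelId, endpoint) overload behind the SKEXP0070 experimental flag (check the signature in the exact alpha version you install, as it may change between releases):

#pragma warning disable SKEXP0070 // the Ollama connector is experimental
var kernel = Kernel.CreateBuilder()
    .AddOllamaChatCompletion("llama3:latest", new Uri("http://localhost:11434"))
    .Build();
builder.Services.AddSingleton(kernel);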

Create the endpoint

[Route("api/[controller]")]
[ApiController]
public class AIChatController : ControllerBase
{
    private readonly OllamaApiClient _ollamaApiClient;

    public AIChatController(OllamaApiClient ollamaApiClient)
    {
        _ollamaApiClient = ollamaApiClient;
    }
    
    [HttpGet("Chat")]
    public async Task Chat()
    {
    #pragma warning disable SKEXP0001
        var history = new List<Message>();
        history.Add(new Message()
        {
            Role = ChatRole.System,
            Content = "you are a useful assistant",
        });
        history.Add(new Message()
        {
            Role = ChatRole.User,
            Content = "hello",
        });
    
        var req = new OllamaSharp.Models.Chat.ChatRequest()
        {
            Messages = history,
            Stream = true
        };
    
        var sb = new StringBuilder();
        var content = _ollamaApiClient.ChatAsync(req);
    
        await foreach (var chatMessageContent in content)
        {
            var msg = chatMessageContent?.Message.Content;
            sb.Append(msg);
            Console.Write(msg);
            await Response.WriteAsync($"data: {msg}\n\n");
            await Response.Body.FlushAsync();
        }
    }
}
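With the API running, you can watch the stream from the terminal (the port below is a placeholder; use the one from your launchSettings.json):

curl -N http://localhost:5000/api/AIChat/Chat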

Response:

Hello! It's nice to meet you. I'm here to assist you with any questions, tasks, or just about anything you'd like to chat about. What's on your mind today?

Moonshot Integration Example

Registration

Program.cs

// Placeholder: keep your real key out of source control (load it from configuration or an environment variable)
var moonshotAIKey = "<your-moonshot-api-key>";
var endpoint = new Uri("https://api.moonshot.cn/v1");
var modelId = "moonshot-v1-8k";
var kernelBuilder = Kernel.CreateBuilder()
      .AddOpenAIChatCompletion(modelId: modelId, apiKey: moonshotAIKey, endpoint: endpoint, httpClient: new HttpClient());

// Register the built Kernel (in the same Program.cs as before) so it can be injected into controllers
builder.Services.AddSingleton(kernelBuilder.Build());

[Route("api/[controller]")]
[ApiController]
public class AIChatController : ControllerBase
{
    private readonly Kernel _kernel;
    public AIChatController(Kernel kernel)
    {
        _kernel = kernel;
    }
    
    /// <summary>
    /// MoonShot
    /// </summary>
    /// <returns></returns>
    [HttpGet("MoonShotChat")]
    public async Task MoonShotChat()
    {
        var settings = new OpenAIPromptExecutionSettings()
        {
            Temperature = 0,
            ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
        };

        var history = new ChatHistory();
        history.AddSystemMessage("you are a useful assistant");
        history.AddUserMessage("hello");
        
        var chatCompletionService = _kernel.GetRequiredService<IChatCompletionService>();
        var result = await chatCompletionService.GetChatMessageContentAsync(history, settings, _kernel);

        System.Console.WriteLine(result.ToString());
        //Hello! How can I help you today?
    }
}
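If you want the Moonshot endpoint to stream its reply the way the Ollama example does, IChatCompletionService also exposes a streaming API. A minimal sketch of an extra action that could be added to the controller above (whether Moonshot's OpenAI-compatible endpoint streams cleanly through the OpenAI connector is an assumption to verify):

[HttpGet("MoonShotChatStream")]
public async Task MoonShotChatStream()
{
    var history = new ChatHistory();
    history.AddSystemMessage("you are a useful assistant");
    history.AddUserMessage("hello");

    var chatCompletionService = _kernel.GetRequiredService<IChatCompletionService>();

    // stream chunks back as server-sent events, like the Ollama example
    Response.ContentType = "text/event-stream";
    await foreach (var chunk in chatCompletionService.GetStreamingChatMessageContentsAsync(history, kernel: _kernel))
    {
        await Response.WriteAsync($"data: {chunk.Content}\n\n");
        await Response.Body.FlushAsync();
    }
}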