A brief note on chainlit stream mode

Streaming is important for chat apps. Most LLM APIs now support stream mode, and chainlit provides support for it as well.

Reference code

import chainlit as cl
from openai import AsyncOpenAI
 
client = AsyncOpenAI(
    api_key="sk-ZTp5zuetNQoJNgG4xHgGzw",
    base_url="http://localhost:4000"
)
 
settings = {
    "model": "dalongdemov3",
    "temperature": 0,
}
 
@cl.on_message
async def on_message(message: cl.Message):
    response = await client.chat.completions.create(
        # enable streaming on the API call
        stream=True,
        messages=[
            {
                "content": "You are a helpful bot, you always reply in chinese.",
                "role": "system"
            },
            {
                "content": message.content,
                "role": "user"
            }
        ],
        **settings
    )
    msg = cl.Message(content="")
    await msg.send()
    async for chunk in response:
        # stream_token is the key: it appends each delta to the UI message
        await msg.stream_token(chunk.choices[0].delta.content or "")
    await msg.update()
 
@cl.on_chat_start
async def main():
    await cl.Message(content="你好").send()

Notes

The above is a simple demo; the official documentation also has detailed examples worth studying.
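The core of the pattern is iterating over an async stream of chunks and appending each delta as it arrives. A minimal sketch of that loop, with a hypothetical fake_stream generator standing in for the OpenAI response (no chainlit or network needed):

```python
import asyncio

# Hypothetical stand-in chunks; the real API yields ChatCompletionChunk
# objects whose delta.content may be None on the final chunk.
CHUNKS = ["你", "好", ",", "世", "界", None]

async def fake_stream():
    for c in CHUNKS:
        await asyncio.sleep(0)  # yield control, like awaiting network data
        yield c

async def consume():
    parts = []
    async for delta in fake_stream():
        # mirrors msg.stream_token(chunk.choices[0].delta.content or "")
        parts.append(delta or "")
    return "".join(parts)

result = asyncio.run(consume())
print(result)
```

The `or ""` guard matters: without it, a final chunk with a `None` delta would raise an error when streamed to the message.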

References

https://docs.chainlit.io/advanced-features/streaming

posted on 2024-08-12 00:06 by 荣锋亮