LitServe: quickly deploy an OpenAI-compatible API service

I have briefly noted before that LitServe can serve APIs in the OpenAI interface format; below is a quick trial.
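
LitServe can be installed from PyPI with pip install litserve; the example below needs nothing else, since no real model is loaded.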

Reference code

  • app.py
import litserve as ls

class SimpleLitAPI(ls.LitAPI):
    def setup(self, device):
        self.model = None

    def predict(self, prompt):
        # With OpenAISpec, `prompt` is a list of dictionaries, each with a role and content,
        # e.g. [{'role': 'user', 'content': 'How can I help you today?'}]
        yield "This is a sample generated output"

if __name__ == "__main__":
    # Enable the OpenAISpec in LitServer
    api = SimpleLitAPI()
    server = ls.LitServer(api, spec=ls.OpenAISpec())
    server.run(port=8000)
  • Running it (start app.py; the server listens on port 8000)

  • Calling the API
curl -X 'POST' \
  'http://0.0.0.0:8000/v1/chat/completions' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
  "model": "",
  "messages": [
    {
      "role": "user",
      "content": "demo"
    }
  ]
}'
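
Since the endpoint follows the OpenAI chat completions spec, the official openai Python client can be pointed at it as well. A minimal sketch, assuming the server above is running locally on port 8000 and no API key has been configured on the server side (so any placeholder key works):

from openai import OpenAI

# Point the official OpenAI client at the local LitServe endpoint.
# Assumes the demo server above is running on port 8000 with no API key configured.
client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="not-used")

response = client.chat.completions.create(
    model="litserve-demo",  # placeholder; this demo server ignores the model name
    messages=[{"role": "user", "content": "demo"}],
)
print(response.choices[0].message.content)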

  • Streaming mode
import litserve as ls

class SimpleLitAPI(ls.LitAPI):
    def setup(self, device):
        self.model = None

    def predict(self, prompt):
        # Yield the output chunk by chunk from a generator
        for chunk in "This is a sample generated output".split():
            yield chunk

if __name__ == "__main__":
    api = SimpleLitAPI()
    server = ls.LitServer(api, spec=ls.OpenAISpec())
    server.run(port=8000)
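
On the client side, streaming works just like with the regular OpenAI API: pass stream=True and iterate over the returned chunks. A minimal sketch under the same assumptions as above (local server on port 8000, no API key configured):

from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:8000/v1", api_key="not-used")

# Request a streaming chat completion and print the chunks as they arrive.
stream = client.chat.completions.create(
    model="litserve-demo",  # placeholder; this demo server ignores the model name
    messages=[{"role": "user", "content": "demo"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()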

Notes

Building AI inference service APIs with LitServe is quite convenient: it is fast and simple, and lets us put our core effort into the business logic.

References

https://lightning.ai/docs/litserve/features/open-ai-spec

https://github.com/Lightning-AI/LitServe

https://github.com/Lightning-AI/litdata

posted on 2024-11-08 08:00  荣锋亮