ollama + ollama web + fastapi app (langchain) demo
https://github.com/fanqingsong/ollama-docker
Welcome to the Ollama Docker Compose Setup! This project simplifies the deployment of Ollama using Docker Compose, making it easy to run Ollama with all its dependencies in a containerized environment.
ollama
https://python.langchain.com/docs/integrations/llms/ollama/
Ollama allows you to run open-source large language models, such as Llama 2, locally.
Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile.
It optimizes setup and configuration details, including GPU usage.
For a complete list of supported models and model variants, see the Ollama model library.
Qwen (Tongyi) models
https://ollama.com/library/qwen
Meta models
https://ollama.com/library/llama2
Ollama image with preloaded models:
https://github.com/FultonBrowne/ollama-docker
ollama + ollama web containerized deployment
https://github.com/lgdd/chatollama
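A containerized setup like the one linked above typically pairs the Ollama server with a web front end. A minimal docker-compose sketch, assuming the official ollama/ollama image on its default port 11434 and the Open WebUI image as the front end (service names, host ports, and volume names here are illustrative):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"              # default Ollama API port
    volumes:
      - ollama_data:/root/.ollama  # persist pulled model weights

  webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                # web UI on http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama

volumes:
  ollama_data:
```

After `docker compose up -d`, models can be pulled inside the container, e.g. `docker compose exec ollama ollama pull llama2`.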
LangChain
https://python.langchain.com/docs/get_started/introduction
LangChain is a framework for developing applications powered by large language models (LLMs).
LangChain simplifies every stage of the LLM application lifecycle:
- Development: Build your applications using LangChain's open-source building blocks and components. Hit the ground running using third-party integrations and Templates.
- Productionization: Use LangSmith to inspect, monitor and evaluate your chains, so that you can continuously optimize and deploy with confidence.
- Deployment: Turn any chain into an API with LangServe.
Concretely, the framework consists of the following open-source libraries:
- langchain-core: Base abstractions and LangChain Expression Language.
- langchain-community: Third-party integrations.
- Partner packages (e.g. langchain-openai, langchain-anthropic, etc.): Some integrations have been further split into their own lightweight packages that only depend on langchain-core.
- langchain: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.
- langgraph: Build robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph.
- langserve: Deploy LangChain chains as REST APIs.
The broader ecosystem includes:
- LangSmith: A developer platform that lets you debug, test, evaluate, and monitor LLM applications and seamlessly integrates with LangChain.
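The composition idea behind langchain-core's Expression Language (chaining steps with `|` into a runnable pipeline) can be illustrated with a tiny stdlib-only sketch; the `Runnable` class below is a toy stand-in for the concept, not the real LangChain API:

```python
# Toy illustration of the LCEL pipe idea: each step is a callable,
# and `|` composes them left to right into one chain.
class Runnable:
    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)

    def __or__(self, other):
        # Feed this step's output into the next step.
        return Runnable(lambda x: other.invoke(self.invoke(x)))


prompt = Runnable(lambda topic: f"Tell me a joke about {topic}")
model = Runnable(lambda p: f"ECHO: {p}")              # stand-in for an LLM call
parser = Runnable(lambda out: out.removeprefix("ECHO: "))

chain = prompt | model | parser
print(chain.invoke("bears"))  # -> Tell me a joke about bears
```

In real LangChain code the same shape appears as `prompt | llm | StrOutputParser()`, with each piece being a proper Runnable.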
API
https://api.python.langchain.com/en/latest/langchain_api_reference.html#
Integration of Ollama and LangChain
https://python.langchain.com/docs/integrations/llms/ollama/#via-langchain

```python
from langchain_community.llms import Ollama

llm = Ollama(model="llama3")
llm.invoke("Tell me a joke")
```
RAG
https://zhuanlan.zhihu.com/p/695140853
https://github.com/fanqingsong/ollama-docker/blob/main/src/rag.py
https://github.com/fanqingsong/DocQA/blob/main/app.py
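The RAG pipelines linked above follow the same steps: retrieve the documents most relevant to the query, then stuff them into the prompt before generation. A conceptual stdlib-only sketch of those steps, using naive word overlap in place of real embeddings (the linked rag.py would use OllamaEmbeddings plus a vector store instead):

```python
def retrieve(query: str, docs: list[str]) -> str:
    # Naive retrieval: pick the document sharing the most words
    # with the query (a real pipeline would use embedding similarity).
    q = set(query.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))


def build_prompt(query: str, context: str) -> str:
    # "Stuff" the retrieved context into the generation prompt.
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"


docs = [
    "Ollama runs open-source LLMs such as Llama 2 locally.",
    "LangServe deploys LangChain chains as REST APIs.",
]
query = "How do I deploy a chain as a REST API?"
prompt = build_prompt(query, retrieve(query, docs))
# `prompt` would then be sent to the LLM, e.g. llm.invoke(prompt).
```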