
ollama + ollama web + fastapi app (langchain) demo

https://github.com/fanqingsong/ollama-docker

Welcome to the Ollama Docker Compose Setup! This project simplifies the deployment of Ollama using Docker Compose, making it easy to run Ollama with all its dependencies in a containerized environment.

 

ollama

https://python.langchain.com/docs/integrations/llms/ollama/

Ollama

Ollama allows you to run open-source large language models, such as Llama 2, locally.

Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile.

It optimizes setup and configuration details, including GPU usage.

For a complete list of supported models and model variants, see the Ollama model library.
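
Once the server is up, a model can be exercised directly over Ollama's local HTTP API before any framework is involved. A minimal sketch against the /api/generate endpoint, assuming the llama2 model has already been pulled and the server listens on its default port 11434:

import json
import urllib.request

# Ollama's HTTP API listens on localhost:11434 by default.
payload = json.dumps({
    "model": "llama2",    # assumes `ollama pull llama2` has been run
    "prompt": "Why is the sky blue?",
    "stream": False,      # single JSON response instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])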

 

Tongyi (Qwen) models

https://ollama.com/library/qwen

 

Meta models (Llama)

https://ollama.com/library/llama2

 

Ollama image with pre-bundled models:

https://github.com/FultonBrowne/ollama-docker

 

ollama + ollama web containerized deployment

https://github.com/lgdd/chatollama

 

LangChain

https://python.langchain.com/docs/get_started/introduction

 

LangChain is a framework for developing applications powered by large language models (LLMs).

LangChain simplifies every stage of the LLM application lifecycle:

(Diagram: the hierarchical organization of the LangChain framework, with interconnected parts across multiple layers.)

Concretely, the framework consists of the following open-source libraries:

  • langchain-core: Base abstractions and LangChain Expression Language.
  • langchain-community: Third party integrations.
    • Partner packages (e.g. langchain-openai, langchain-anthropic, etc.): Some integrations have been further split into their own lightweight packages that only depend on langchain-core.
  • langchain: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.
  • langgraph: Build robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph.
  • langserve: Deploy LangChain chains as REST APIs.

The broader ecosystem includes:

  • LangSmith: A developer platform that lets you debug, test, evaluate, and monitor LLM applications and seamlessly integrates with LangChain.
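
A minimal sketch of how the libraries listed above fit together: langchain-core supplies the prompt, the output parser, and the | composition operator of the LangChain Expression Language, while langchain-community supplies the Ollama integration (the model name here is an assumption):

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_community.llms import Ollama

# prompt (langchain-core) | model (langchain-community) | parser (langchain-core)
prompt = ChatPromptTemplate.from_template("Explain {topic} in one paragraph.")
llm = Ollama(model="llama3")
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"topic": "the LangChain Expression Language"}))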

 

API

https://api.python.langchain.com/en/latest/langchain_api_reference.html#

 

Integration of Ollama and LangChain

https://python.langchain.com/docs/integrations/llms/ollama/#via-langchain
from langchain_community.llms import Ollama

# Load a model that has already been pulled with `ollama pull llama3`
llm = Ollama(model="llama3")

llm.invoke("Tell me a joke")
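
When Ollama runs in its own container, as in the Docker Compose setup above, the wrapper has to be pointed at that container rather than localhost. A minimal sketch, assuming the Compose service is named ollama and exposes the default port 11434 (adjust to your own compose file):

from langchain_community.llms import Ollama

# base_url defaults to http://localhost:11434; override it with the
# address of the Ollama container (service name assumed here).
llm = Ollama(model="llama3", base_url="http://ollama:11434")

print(llm.invoke("Tell me a joke"))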

 

 

RAG

https://zhuanlan.zhihu.com/p/695140853

 

https://github.com/fanqingsong/ollama-docker/blob/main/src/rag.py

 

https://github.com/fanqingsong/DocQA/blob/main/app.py
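
The two linked scripts follow the usual retrieval-augmented generation loop: split documents, embed and index them, retrieve the chunks relevant to a question, and let the local model answer from that context. Below is a minimal sketch of that pattern with LangChain and Ollama, not the repositories' actual code; the FAISS store, embedding model, and sample text are illustrative assumptions:

from langchain_community.llms import Ollama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_text_splitters import RecursiveCharacterTextSplitter

# 1. Split the source text into chunks and index them with Ollama embeddings.
source = ["Ollama bundles model weights, configuration, and data into a Modelfile."]
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.create_documents(source)
store = FAISS.from_documents(chunks, OllamaEmbeddings(model="llama3"))
retriever = store.as_retriever()

# 2. Stuff the retrieved chunks into the prompt and answer with the local model.
prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)
llm = Ollama(model="llama3")

question = "What does a Modelfile contain?"
context = "\n\n".join(doc.page_content for doc in retriever.invoke(question))
print((prompt | llm | StrOutputParser()).invoke({"context": context, "question": question}))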

 
