
ollama + ollama web + fastapi app (langchain) demo


https://github.com/fanqingsong/ollama-docker

Welcome to the Ollama Docker Compose Setup! This project simplifies the deployment of Ollama using Docker Compose, making it easy to run Ollama with all its dependencies in a containerized environment.
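As a quick smoke test of the containerized setup (a minimal sketch, assuming the compose file exposes Ollama on the default host port 11434; your docker-compose.yml may map a different port), you can ask the HTTP API for its list of local models:

import requests

OLLAMA_URL = "http://localhost:11434"  # assumed default port mapping

def ollama_is_up(base_url: str = OLLAMA_URL) -> bool:
    """Return True if the Ollama HTTP API answers on /api/tags (lists local models)."""
    try:
        resp = requests.get(f"{base_url}/api/tags", timeout=5)
        return resp.ok
    except requests.ConnectionError:
        return False

if __name__ == "__main__":
    print("ollama reachable:", ollama_is_up())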

 

ollama

https://python.langchain.com/docs/integrations/llms/ollama/

Ollama

Ollama allows you to run open-source large language models, such as Llama 2, locally.

Ollama bundles model weights, configuration, and data into a single package, defined by a Modelfile.

It optimizes setup and configuration details, including GPU usage.

For a complete list of supported models and model variants, see the Ollama model library.
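Besides the CLI, a running Ollama server exposes a simple REST API. A minimal sketch, assuming the server listens on localhost:11434 and the llama2 model has already been pulled:

import requests

# non-streaming generation request against the Ollama REST API
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2", "prompt": "Why is the sky blue?", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated completion text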

 

Tongyi (Qwen) model

https://ollama.com/library/qwen

 

Meta model (Llama 2)

https://ollama.com/library/llama2

 

Ollama Docker image with preloaded models:

https://github.com/FultonBrowne/ollama-docker
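If the image does not ship with the model you need, a library model such as qwen or llama2 can also be pulled over the REST API. A hedged sketch, assuming the default port 11434:

import requests

def pull_model(name: str, base_url: str = "http://localhost:11434") -> None:
    """Stream the pull progress for a model from the Ollama library."""
    with requests.post(f"{base_url}/api/pull", json={"name": name}, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if line:
                print(line.decode("utf-8"))  # JSON status lines with download progress

pull_model("qwen")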

 

ollama + ollama web containerized deployment

https://github.com/lgdd/chatollama

 

LangChain

https://python.langchain.com/docs/get_started/introduction

 

LangChain is a framework for developing applications powered by large language models (LLMs).

LangChain simplifies every stage of the LLM application lifecycle:

[Diagram: the hierarchical organization of the LangChain framework, showing the interconnected parts across multiple layers.]

Concretely, the framework consists of the following open-source libraries:

  • langchain-core: Base abstractions and LangChain Expression Language.
  • langchain-community: Third party integrations.
    • Partner packages (e.g. langchain-openai, langchain-anthropic, etc.): Some integrations have been further split into their own lightweight packages that only depend on langchain-core.
  • langchain: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.
  • langgraph: Build robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph.
  • langserve: Deploy LangChain chains as REST APIs.

The broader ecosystem includes:

  • LangSmith: A developer platform that lets you debug, test, evaluate, and monitor LLM applications and seamlessly integrates with LangChain.
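As a small illustration of how these pieces fit together, here is a minimal LCEL (LangChain Expression Language) chain built from langchain-core primitives and the langchain-community Ollama integration. This is a sketch, assuming llama3 has already been pulled locally:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_community.llms import Ollama

prompt = ChatPromptTemplate.from_template("Explain {topic} in one sentence.")
llm = Ollama(model="llama3")  # assumes a local Ollama server with llama3 pulled

# langchain-core pieces composed with the | operator into a runnable chain
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"topic": "docker compose"}))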

 

API

https://api.python.langchain.com/en/latest/langchain_api_reference.html#

 

Integration of Ollama and LangChain

https://python.langchain.com/docs/integrations/llms/ollama/#via-langchain

from langchain_community.llms import Ollama

# point the LangChain wrapper at a locally pulled model
llm = Ollama(model="llama3")

# single-shot completion
llm.invoke("Tell me a joke")
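The same local model can also be used through the chat interface and streamed token by token. A sketch, assuming llama3 is available; ChatOllama lives in langchain_community.chat_models:

from langchain_community.chat_models import ChatOllama
from langchain_core.messages import HumanMessage

chat = ChatOllama(model="llama3")

# stream() yields chunks as they are generated instead of waiting for the full reply
for chunk in chat.stream([HumanMessage(content="Tell me a joke")]):
    print(chunk.content, end="", flush=True)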

 

 

RAG

https://zhuanlan.zhihu.com/p/695140853

 

https://github.com/fanqingsong/ollama-docker/blob/main/src/rag.py

 

https://github.com/fanqingsong/DocQA/blob/main/app.py
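For orientation, a minimal RAG pipeline with Ollama and LangChain usually looks like the sketch below. This mirrors the common index-then-generate pattern, not necessarily what rag.py above implements; the sample texts and model names are placeholders, and Chroma requires the chromadb package:

from langchain_community.llms import Ollama
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

# 1. Index: embed documents with a local Ollama embedding model and store them.
docs = [
    "Ollama runs open-source LLMs such as Llama 2 locally.",
    "LangChain is a framework for building LLM-powered applications.",
]
vectorstore = Chroma.from_texts(docs, embedding=OllamaEmbeddings(model="llama2"))
retriever = vectorstore.as_retriever()

# 2. Generate: stuff the retrieved context into a prompt for the local model.
prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)
llm = Ollama(model="llama2")

def format_docs(docs):
    return "\n\n".join(d.page_content for d in docs)

rag_chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | llm
    | StrOutputParser()
)

print(rag_chain.invoke("What does Ollama do?"))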

 
