Ollama: the fastest way to deploy and manage large language models

GitHub: https://github.com/ollama/ollama

Model library: https://ollama.com/library/llama3.1

Linux installation

1. Download and run the install script

curl -fsSL https://ollama.com/install.sh | sh

2. Edit the service environment variables

If you run Ollama as root, remember to set User and Group to root.

vim /etc/systemd/system/ollama.service

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=root
Group=root
Restart=always
RestartSec=3
Environment="PATH=/root/.nvm/versions/node/v18.20.4/bin:/home/miniconda3/bin:/home/miniconda3/condabin:/usr/lib64/qt-3.3/bin:/root/perl5/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:/usr/local/mysql/bin"
Environment="OLLAMA_DEBUG=1"
Environment="OLLAMA_HOST=0.0.0.0:11434"
Environment="OLLAMA_KEEP_ALIVE=5h"
Environment="OLLAMA_MAX_LOADED_MODELS=10"
#Environment="OLLAMA_MAX_QUEUE=100"
Environment="OLLAMA_MODELS=/home/data/llm/ollama/models/"

[Install]
WantedBy=default.target
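After editing the unit file, systemd must pick up the change before it takes effect. A minimal sketch of the reload-and-restart step (assumes a systemd-based distribution, matching the unit file above):

```shell
# Reload unit files so systemd sees the edited ollama.service
sudo systemctl daemon-reload
# Restart the service so the new environment variables apply
sudo systemctl restart ollama
# Confirm the service is running
sudo systemctl status ollama
```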

3. Related commands

(base) [root@ceph1 ~]# ollama 
Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
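Besides the CLI, `ollama serve` also exposes an HTTP API on the address set by OLLAMA_HOST (0.0.0.0:11434 above). A minimal sketch of calling the /api/generate endpoint with Python's standard library; the host and model name here are placeholders for your own setup:

```python
import json
import urllib.request

def build_generate_request(model, prompt):
    """Build the JSON payload for Ollama's /api/generate endpoint.
    stream=False asks the server for a single JSON response instead
    of a stream of chunks."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(host, model, prompt):
    """POST a prompt to an Ollama server and return the generated text."""
    payload = json.dumps(build_generate_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"http://{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

# Example (assumes the service above is running and the model is pulled):
#   generate("127.0.0.1:11434", "llama3.1:70b", "Hello")
```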

4. Run a model

ollama run llama3.1:70b

On the first run, the model is downloaded into the directory set by the OLLAMA_MODELS environment variable (/home/data/llm/ollama/models/ above).

Subsequent runs skip the download and start the model immediately.
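You can verify the cached download on disk: Ollama stores pulled models as manifest files plus content-addressed blobs under OLLAMA_MODELS. A small sketch that lists the cached model tags (the path is the one from the unit file above; adjust it to your setup):

```python
import os

def list_cached_models(models_dir):
    """Walk the manifests tree under an Ollama models directory and
    return the model tag paths (registry/namespace/name/tag) found there.
    Returns an empty list when nothing has been pulled yet."""
    manifests = os.path.join(models_dir, "manifests")
    found = []
    for root, _dirs, files in os.walk(manifests):
        for tag in files:
            rel = os.path.relpath(os.path.join(root, tag), manifests)
            found.append(rel.replace(os.sep, "/"))
    return sorted(found)

if __name__ == "__main__":
    print(list_cached_models("/home/data/llm/ollama/models/"))
```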

 

 

posted @ 2024-08-29 17:07  夜半钟声到客船