Ollama offline installation (Linux)

1. Offline installation:
https://github.com/ollama/ollama/blob/main/docs/linux.md

2. Default model location:
/usr/share/ollama/.ollama/models

Reference:
https://blog.csdn.net/hooksten/article/details/145418987
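The default model directory above can be redirected with the `OLLAMA_MODELS` environment variable. A minimal sketch of resolving the effective directory, assuming the Linux default path noted above:

```shell
#!/bin/sh
# Resolve the directory Ollama will use for models:
# OLLAMA_MODELS if set, otherwise the Linux service default.
MODELS_DIR="${OLLAMA_MODELS:-/usr/share/ollama/.ollama/models}"
echo "$MODELS_DIR"
```

When Ollama runs as a systemd service, setting `OLLAMA_MODELS` in your login shell has no effect; it must be set in the service unit (see the Customizing section below).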
Linux
Install
To install Ollama, run the following command:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Manual install
[!NOTE]
If you are upgrading from a prior version, you should remove the old libraries with `sudo rm -rf /usr/lib/ollama` first.
Download and extract the package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
Start Ollama:
```shell
ollama serve
```
In another terminal, verify that Ollama is running:
```shell
ollama -v
```
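Besides `ollama -v`, you can probe the HTTP API directly; by default the server listens on localhost port 11434. A sketch (the fallback message below is ours, not Ollama output):

```shell
# Query the version endpoint of a running Ollama server.
# When the server is up, this prints a small JSON document
# such as {"version":"..."}; otherwise the fallback message.
curl -fsS http://localhost:11434/api/version 2>/dev/null \
  || echo "Ollama is not reachable on localhost:11434"
```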
AMD GPU install
If you have an AMD GPU, also download and extract the additional ROCm package:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64-rocm.tgz -o ollama-linux-amd64-rocm.tgz
sudo tar -C /usr -xzf ollama-linux-amd64-rocm.tgz
```
ARM64 install
Download and extract the ARM64-specific package:
```shell
curl -L https://ollama.com/download/ollama-linux-arm64.tgz -o ollama-linux-arm64.tgz
sudo tar -C /usr -xzf ollama-linux-arm64.tgz
```
Adding Ollama as a startup service (recommended)
Create a user and group for Ollama:
```shell
sudo useradd -r -s /bin/false -U -m -d /usr/share/ollama ollama
sudo usermod -a -G ollama $(whoami)
```
Create a service file in `/etc/systemd/system/ollama.service`:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"

[Install]
WantedBy=default.target
```
Then start the service:
```shell
sudo systemctl daemon-reload
sudo systemctl enable ollama
```
Install CUDA drivers (optional)
Download and install CUDA.
Verify that the drivers are installed by running the following command, which should print details about your GPU:
```shell
nvidia-smi
```
Install AMD ROCm drivers (optional)
Download and install ROCm v6.
Start Ollama
Start Ollama and verify it is running:
```shell
sudo systemctl start ollama
sudo systemctl status ollama
```
[!NOTE]
While AMD has contributed the `amdgpu` driver upstream to the official linux kernel source, the version is older and may not support all ROCm features. We recommend you install the latest driver from https://www.amd.com/en/support/linux-drivers for best support of your Radeon GPU.
Customizing
To customize the installation of Ollama, you can edit the systemd service file or the environment variables by running:
```shell
sudo systemctl edit ollama
```
Alternatively, create an override file manually in `/etc/systemd/system/ollama.service.d/override.conf`:

```ini
[Service]
Environment="OLLAMA_DEBUG=1"
```
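The same override mechanism works for other settings. As a sketch, the fragment below binds the server to all interfaces and relocates the model store; `OLLAMA_HOST` and `OLLAMA_MODELS` are the documented variable names, but the values and path here are illustrative, not defaults:

```ini
[Service]
# Listen on all interfaces instead of only localhost
Environment="OLLAMA_HOST=0.0.0.0:11434"
# Illustrative alternative model directory; the ollama
# service user must be able to write to it
Environment="OLLAMA_MODELS=/data/ollama/models"
```

After editing, apply the change with `sudo systemctl daemon-reload` followed by `sudo systemctl restart ollama`.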
Updating
Update Ollama by running the install script again:
```shell
curl -fsSL https://ollama.com/install.sh | sh
```
Or by re-downloading Ollama:
```shell
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz
```
Installing specific versions
Use the `OLLAMA_VERSION` environment variable with the install script to install a specific version of Ollama, including pre-releases. You can find the version numbers in the releases page.
For example:
```shell
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.7 sh
```
Viewing logs
To view logs of Ollama running as a startup service, run:
```shell
journalctl -e -u ollama
```
Uninstall
Remove the ollama service:
```shell
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service
```
Remove the ollama binary from your bin directory (either `/usr/local/bin`, `/usr/bin`, or `/bin`):
```shell
sudo rm $(which ollama)
```
Remove the downloaded models and Ollama service user and group:
```shell
sudo rm -r /usr/share/ollama
sudo userdel ollama
sudo groupdel ollama
```