huggingface-cli Usage

Installing huggingface-cli

First, make sure you have huggingface-cli installed:

pip install -U "huggingface_hub[cli]"
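If you plan to download gated repositories such as meta-llama/Meta-Llama-3.1-70B-Instruct (used below), you will typically need to authenticate with an access token first:

huggingface-cli login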

Overview

Then, you can target the specific file you want:

huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF --include "Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf" --local-dir ./

If the model is bigger than 50GB, it will have been split into multiple files. In order to download them all to a local folder, run:

huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF --include "Meta-Llama-3.1-8B-Instruct-Q8_0.gguf/*" --local-dir Meta-Llama-3.1-8B-Instruct-Q8_0

You can either specify a new local-dir (Meta-Llama-3.1-8B-Instruct-Q8_0) or download them all in place (./).

By default the entire repository is downloaded. To fetch only specific files, add a space after the repo ID and list the file names you need (see the example below).
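For instance, to grab a single quantization file by name (the file name here comes from the repo's file listing; adjust it to the file you actually want):

huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf --local-dir ./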

To limit the download to matching files or folders, use the --include option.

To skip matching files or folders, use the --exclude option.
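Both options take one or more glob patterns and can be combined in a single command; a sketch (the patterns here are assumptions about the repo layout, adjust as needed):

huggingface-cli download meta-llama/Meta-Llama-3.1-70B-Instruct --include "*.json" "*.safetensors" --exclude "original/*" --local-dir meta-llama/Meta-Llama-3.1-70B-Instruct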

Excluding the files in a directory

huggingface-cli download meta-llama/Meta-Llama-3.1-70B-Instruct --exclude "original/*" --local-dir meta-llama/Meta-Llama-3.1-70B-Instruct

Including only a specific file

huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF --include "Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf" --local-dir ./

Including only a specific folder

huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF --include "Meta-Llama-3.1-8B-Instruct-Q8_0.gguf/*" --local-dir Meta-Llama-3.1-8B-Instruct-Q8_0
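As an optional speed-up for large downloads, the hf_transfer backend can be enabled (a sketch; assumes the optional hf_transfer package is installed):

pip install -U hf_transfer

HF_HUB_ENABLE_HF_TRANSFER=1 huggingface-cli download bartowski/Meta-Llama-3.1-8B-Instruct-GGUF --include "Meta-Llama-3.1-8B-Instruct-Q4_K_M.gguf" --local-dir ./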
