Fixing "ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported"

## Problem

Loading the LLaMA 7B weights fails with:

ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported.

## Cause

In newer versions of transformers, the LLaMA tokenizer class is named LlamaTokenizer.

Older model checkpoints, however, still refer to the tokenizer as LLaMATokenizer.
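The mismatch can be seen directly in the checkpoint's tokenizer_config.json, which is where AutoTokenizer reads the class name from. A minimal sketch, where the config file is created on the fly to stand in for a real model directory:

```python
import json
import os
import tempfile

# Stand-in for an old LLaMA checkpoint directory; a real one ships a
# tokenizer_config.json with the same "tokenizer_class" entry.
model_dir = tempfile.mkdtemp()
with open(os.path.join(model_dir, "tokenizer_config.json"), "w") as f:
    json.dump({"tokenizer_class": "LLaMATokenizer"}, f)

# Inspect which tokenizer class the checkpoint declares.
with open(os.path.join(model_dir, "tokenizer_config.json")) as f:
    cls = json.load(f)["tokenizer_class"]
print(cls)  # old checkpoints say "LLaMATokenizer", not "LlamaTokenizer"
```

Since the new library only exports LlamaTokenizer, the lookup for the declared name fails and raises the ValueError above.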

## Solution

Edit three files in the transformers source:

utils/dummy_sentencepiece_objects.py

models/auto/tokenization_auto.py

__init__.py

In these three files, find LlamaTokenizer and change it to LLaMATokenizer.
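The rename can be scripted instead of done by hand. A minimal sketch: the helper below performs the substitution on one file, demonstrated here on a throwaway copy. In practice, point it at the three files under the directory reported by print(transformers.__file__), and back them up first, since any upgrade of the package overwrites the change:

```python
import os
import tempfile

def patch_tokenizer_name(path):
    """Replace LlamaTokenizer with LLaMATokenizer in one source file."""
    with open(path) as f:
        src = f.read()
    with open(path, "w") as f:
        f.write(src.replace("LlamaTokenizer", "LLaMATokenizer"))

# Demo on a throwaway file containing a line of the kind found in
# models/auto/tokenization_auto.py (the exact line varies by version).
demo = os.path.join(tempfile.mkdtemp(), "tokenization_auto.py")
with open(demo, "w") as f:
    f.write('("llama", ("LlamaTokenizer", None)),')

patch_tokenizer_name(demo)
with open(demo) as f:
    patched = f.read()
print(patched)  # now references LLaMATokenizer
```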

Reference: github.com/mbehm/transformers/tree/main/src/transformers

Note: to locate the installed library's source in the current environment, run print(transformers.__file__).


posted @ 2023-07-20 09:54 mnluzimu