Hello World


Error:

/opt/miniconda3/envs/llama_xyj/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi

Install the following wheel:

https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

 

This still raises an error:

RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):

/opt/miniconda3/envs/llama_xyj/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
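As a quick sanity check (my addition, not part of the original post), the mangled symbol in the error message can be decoded with `c++filt` (a standard binutils tool, assumed to be installed) to see which torch C++ API the extension was linked against:

```shell
# Demangle the missing symbol to a readable C++ name
echo _ZN3c104cuda9SetDeviceEi | c++filt
# -> c10::cuda::SetDevice(int)
```

`c10::cuda::SetDevice(int)` lives inside PyTorch's own libraries, which is a strong hint that the flash-attn binary and the installed torch were built against different PyTorch versions.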

 

It turns out the local PyTorch version does not match: flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE is built against torch 2.2, so simply running

 pip install torch==2.2.0

resolves the problem.
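The required torch version can be read straight off the wheel's local version tag (the `torch2.2` part of the filename). A minimal sketch, with a hypothetical helper name `required_torch`:

```python
import re

def required_torch(wheel_name: str) -> str:
    """Extract the torch version a flash-attn wheel was built against
    from its local version tag, e.g. '...+cu122torch2.2cxx11abiFALSE...'."""
    m = re.search(r"torch(\d+\.\d+)", wheel_name)
    if not m:
        raise ValueError(f"no torch tag in {wheel_name!r}")
    return m.group(1)

wheel = "flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"
print(required_torch(wheel))  # -> 2.2

# To compare against the locally installed torch (requires torch):
# import torch
# assert torch.__version__.startswith(required_torch(wheel))
```

The same filename also encodes the CUDA build (`cu122`) and the CPython ABI (`cp310`), which must likewise match the local environment.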

 

posted on 2024-06-13 18:18  swuxyj
