/opt/miniconda3/envs/llama_xyj/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104cuda9SetDeviceEi
Summary:
This undefined symbol (c10::cuda::SetDevice(int)) means the installed flash-attn binary was built against a different PyTorch, so install a prebuilt wheel matching the environment (CUDA 12.2, torch 2.2, cxx11abi=FALSE, Python 3.10): https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
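As a quick sanity check before picking a wheel, the version tags in the filename (cu122, torch2.2, cxx11abiFALSE, cp310) can be matched against the local environment; a minimal Python sketch, assuming a standard PyTorch install:

    import sys
    import torch

    # The wheel's filename tags must match these values, otherwise the
    # compiled .so will reference symbols the installed torch does not export.
    print(torch.__version__)                # torch2.2      -> expect 2.2.x
    print(torch.version.cuda)               # cu122         -> expect 12.2
    print(torch._C._GLIBCXX_USE_CXX11_ABI)  # cxx11abiFALSE -> expect False
    print(sys.version_info[:2])             # cp310         -> expect (3, 10)

If everything matches, the prebuilt wheel can be installed directly with pip, e.g. `pip install <wheel URL above>`, instead of letting pip compile flash-attn from source.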
posted @ 2024-06-13 18:18 swuxyj