LLaMA Factory Environment Setup
Install the torch and CUDA versions specified in the official LLaMA-Factory documentation.
Reference:
PyTorch
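A minimal sketch of the install, assuming a CUDA 12.1 build (the cu121 tag is an assumption; pick the wheel that matches your driver per the LLaMA-Factory docs and the PyTorch install selector):

# Install a CUDA 12.1 build of torch (swap cu121 for the tag matching your CUDA).
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121

# Sanity check: torch imports and can see the GPU.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"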
Troubleshooting
1. ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29` not found
Fix it as described in the post "Smoothly fixing ImportError: /usr/lib/x86_64-linux-gnu/libstdc++.so.6: version `GLIBCXX_3.4.29` not found"; a sketch follows.
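A sketch of the commonly used fix, assuming a conda environment (verify against the linked post): install a newer libstdc++ from conda-forge so that GLIBCXX_3.4.29 is available.

# Check whether the system libstdc++ actually provides the missing symbol version.
strings /usr/lib/x86_64-linux-gnu/libstdc++.so.6 | grep GLIBCXX_3.4.29

# Pull a newer libstdc++ into the active conda environment (assumes conda is used).
conda install -c conda-forge libstdcxx-ng

# Confirm the conda copy now exposes the symbol version.
strings "$CONDA_PREFIX/lib/libstdc++.so.6" | grep GLIBCXX_3.4.29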
2. flash_attn_2_cuda.cpython-311-x86_64-linux-gnu.so: undefined symbol: _ZN3c105ErrorC2ENS_14SourceLocationENSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE
Reinstall flash-attention as described here (a sketch follows):
[Resolved] flash-attn error flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol (CSDN blog)
Installing flash-attention
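A hedged sketch of the reinstall: this undefined-symbol error usually means the flash-attn wheel was built against a different torch ABI, so rebuild it against the torch that is currently installed.

# Remove the mismatched build.
pip uninstall -y flash-attn

# Reinstall so the build uses the torch already in this environment.
pip install flash-attn --no-build-isolation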
3. python3.11/site-packages/torch/lib/../../nvidia/cusparse/lib/libcusparse.so.12: symbol __nvJitLinkComplete_12_4 version libnvJitLink.so.12 not defined in file libnvJitLink.so.12 with link time reference
The torch installation is broken; reinstall torch (see the sketch below).
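A sketch of a clean reinstall (cu121 is an assumption; use the build matching your CUDA):

# Remove the broken torch packages.
pip uninstall -y torch torchvision torchaudio

# Reinstall from the PyTorch index so torch and its nvidia-* runtime wheels
# (libcusparse, libnvjitlink, ...) are pulled as one consistent set.
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu121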
4. ImportError: cannot import name 'log' from 'torch.distributed.elastic.agent.server.api'
See: "Successfully fixing ImportError: cannot import name 'log' from 'torch.distributed.elastic.agent.server.api'" (CSDN blog)
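The error appears because newer torch releases removed the module-level 'log' name. A hedged sketch of the usual workaround (the file performing the failing import depends on your setup and is an assumption here; verify against the linked post):

# Quick check: newer torch exposes 'logger' instead of 'log' in this module.
python -c "from torch.distributed.elastic.agent.server import api; print(hasattr(api, 'log'), hasattr(api, 'logger'))"

# Workaround: in the file whose import fails, change
#   from torch.distributed.elastic.agent.server.api import log
# to
#   from torch.distributed.elastic.agent.server.api import logger as log
# or align the torch version with what that package expects.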