Llama-3.1: Download and Deployment
Downloading from Hugging Face
Fill out the access request form on the model's detail page, then wait for approval.
https://i-blog.csdnimg.cn/direct/8519c613c9be479bbc5f47e5cae20883.png
Click your avatar -> Settings -> Access Tokens to create a token.
https://i-blog.csdnimg.cn/direct/3b54a6e06f1e45c4adda82dc9c7abba0.png
Configure the environment variables
https://i-blog.csdnimg.cn/direct/18cb9b93c5e3405ea3e645979129bd2c.png
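The same variables the screenshot configures can also be set from Python, as long as this runs before `huggingface_hub` or `transformers` is imported. `HF_ENDPOINT` redirects downloads to the mirror and `HF_TOKEN` supplies the access token; the token value below is a placeholder, as in the rest of this post.

```python
import os

# Set these before importing huggingface_hub / transformers, which read
# them when a download starts. The token value is a placeholder.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
os.environ["HF_TOKEN"] = "xxxxx"

print(os.environ["HF_ENDPOINT"])
```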
Download the model
pip install -U huggingface_hub
huggingface-cli download --resume-download meta-llama/Meta-Llama-3.1-8B-Instruct --local-dir E:\codes\model\meta-llama\Meta-Llama-3.1-8B-Instruct --local-dir-use-symlinks False --token xxxxx
https://i-blog.csdnimg.cn/direct/d6c0f941c0234cf3838383f89222ee44.png
On Linux:
export HF_ENDPOINT=https://hf-mirror.com
huggingface-cli download --resume-download meta-llama/Meta-Llama-3.1-8B-Instruct --local-dir /home/model/meta-llama/Meta-Llama-3.1-8B-Instruct --local-dir-use-symlinks False --token xxxxx
Use wget to download a single file at a time
wget --header "Authorization: Bearer your_token" https://hf-mirror.com/meta-llama/Meta-Llama-3.1-8B/resolve/main/model-00003-of-00004.safetensors
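The URL wget fetches follows a fixed pattern, `<endpoint>/<repo_id>/resolve/<revision>/<filename>`. A small helper (a sketch; the function name is my own) builds the URL for any shard, so you can script per-file downloads:

```python
def resolve_url(repo_id: str, filename: str,
                endpoint: str = "https://hf-mirror.com",
                revision: str = "main") -> str:
    # Mirrors the URL layout used by the wget command above.
    return f"{endpoint}/{repo_id}/resolve/{revision}/{filename}"

print(resolve_url("meta-llama/Meta-Llama-3.1-8B",
                  "model-00003-of-00004.safetensors"))
```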
Deployment
Environment: Python 3.10
pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu121
pip install transformers==4.43.2 numpy==1.26.4 bitsandbytes==0.43.3 accelerate==0.33.0 -i https://pypi.tuna.tsinghua.edu.cn/simple
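Before running the deployment script, it is worth confirming that the pinned packages actually installed. A stdlib-only sketch (the helper name is my own):

```python
import importlib.util

def missing_packages(names):
    # Return the packages that cannot be imported in this environment.
    return [n for n in names if importlib.util.find_spec(n) is None]

required = ["torch", "transformers", "numpy", "bitsandbytes", "accelerate"]
print(missing_packages(required))  # empty list when everything is installed
```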
from transformers import pipeline
import torch

model_id = r"E:\codes\model\meta-llama\Meta-Llama-3.1-8B-Instruct"

# Full-precision (bfloat16) variant -- needs more GPU memory:
# pipe = pipeline(
#     "text-generation",
#     model=model_id,
#     model_kwargs={"torch_dtype": torch.bfloat16},
#     device_map="auto",
# )

# 4-bit quantized variant (requires bitsandbytes):
pipe = pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={
        "torch_dtype": torch.bfloat16,
        "quantization_config": {"load_in_4bit": True},
    },
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

outputs = pipe(
    messages,
    max_new_tokens=256,
)
# The pipeline returns a list of results; "generated_text" holds the full
# chat history, with the newly generated assistant reply as the last entry.
print(outputs[0]["generated_text"][-1])
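For chat input, the pipeline's return value is a one-element list whose "generated_text" field contains the whole conversation with the new assistant turn appended last. A tiny helper (a sketch; the name is my own) makes the unpacking explicit, shown here against a mocked result so the shape is clear:

```python
def last_assistant_message(outputs):
    # text-generation pipeline: one result dict per prompt; for chat input,
    # "generated_text" is the message list with the reply appended last.
    return outputs[0]["generated_text"][-1]

# Shape check against a mocked pipeline result:
mock = [{"generated_text": [
    {"role": "system", "content": "You are a pirate chatbot..."},
    {"role": "user", "content": "Who are you?"},
    {"role": "assistant", "content": "Arrr, I be a pirate bot!"},
]}]
print(last_assistant_message(mock))  # prints the assistant dict
```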