Posted by 三尺非寒 on 2025-2-20 05:45:52

Connecting LLaMA-Factory to the wandb Dashboard

https://i-blog.csdnimg.cn/direct/7fe7ff62885146368ba917870c97e280.png
   https://github.com/hiyouga/LLaMA-Factory/blob/main/README_zh.md
Step 1: Register a wandb account and get your API key

For details, see: WandB Quick Tutorial [Weights & Biases]
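As an alternative to the interactive prompt that appears later during training, you can authenticate ahead of time from the shell (a sketch, assuming the `wandb` Python package is installed; replace the placeholder value with your own key from https://wandb.ai/authorize):

```shell
# Option A: supply the key via environment variable, which wandb reads automatically
export WANDB_API_KEY=your_key_here   # placeholder -- paste your real key

# Option B: log in once interactively; the key is cached locally for future runs
wandb login
```

With either option in place, the training run should skip the account-selection prompt entirely.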
Step 2: Configuration

Note: I ran into problems doing this from the command line, so this walkthrough uses the web UI instead.

[*]Once the web UI is up, open the settings; the wandb option is here:
https://i-blog.csdnimg.cn/direct/2bc8d5c777ec4c2b862070e3a5a0f719.png
[*]Then preview the command. If `report_to: all` appears, the setting took effect; go ahead and run it:
llamafactory-cli train \
    --stage sft \
    --do_train True \
    --model_name_or_path {model_path} \
    --preprocessing_num_workers 16 \
    --finetuning_type lora \
    --template llama3 \
    --flash_attn auto \
    --dataset_dir data \
    --dataset {dataset_path} \
    --cutoff_len 1024 \
    --learning_rate 5e-05 \
    --num_train_epochs 3.0 \
    --max_samples 100000 \
    --per_device_train_batch_size 2 \
    --gradient_accumulation_steps 8 \
    --lr_scheduler_type cosine \
    --max_grad_norm 1.0 \
    --logging_steps 5 \
    --save_steps 100 \
    --warmup_steps 0 \
    --optim adamw_torch \
    --packing False \
    --report_to all \
    --output_dir {output_dir} \
    --bf16 True \
    --plot_loss True \
    --ddp_timeout 180000000 \
    --include_num_input_tokens_seen True \
    --lora_rank 8 \
    --lora_alpha 16 \
    --lora_dropout 0 \
    --lora_target all
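If you prefer a config file over a long command line, LLaMA-Factory also accepts a YAML file whose keys mirror the CLI flags. A minimal sketch (the filename and path values are illustrative, and the remaining flags from the command above can be added the same way):

```yaml
# wandb_sft.yaml -- hypothetical filename; each key mirrors a CLI flag above
stage: sft
do_train: true
model_name_or_path: /path/to/model    # your model path
finetuning_type: lora
template: llama3
dataset_dir: data
dataset: your_dataset                 # your dataset name
cutoff_len: 1024
learning_rate: 5.0e-5
num_train_epochs: 3.0
per_device_train_batch_size: 2
gradient_accumulation_steps: 8
output_dir: /path/to/output           # your output path
report_to: all                        # enables the wandb logging callback
```

Then launch it with `llamafactory-cli train wandb_sft.yaml`.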

[*]Near the end of startup you will be asked to choose a W&B usage mode; enter 2 and press Enter:
wandb: Using wandb-core as the SDK backend. Please refer to https://wandb.me/wandb-core for more information.
wandb: (1) Create a W&B account
wandb: (2) Use an existing W&B account
wandb: (3) Don't visualize my results
wandb: Enter your choice: 2

[*]Paste the API key you obtained from the website:
wandb: You chose 'Use an existing W&B account'
wandb: Logging into wandb.ai. (Learn how to deploy a W&B server locally: https://wandb.me/wandb-server)
wandb: You can find your API key in your browser here: https://wandb.ai/authorize
wandb: Paste an API key from your profile and hit enter, or press ctrl+c to quit:
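To control how the run is named on the wandb side, the Hugging Face Trainer that LLaMA-Factory builds on reads a few wandb environment variables; set them before launching (the values below are examples, not defaults):

```shell
export WANDB_PROJECT=llamafactory-sft   # project name shown in the wandb dashboard
export WANDB_NAME=llama3-lora-run1      # optional: display name for this run
```

Without these, runs land in wandb's default project with an auto-generated name.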

[*]Finally, click