Support YAML config files with the DeepSpeed launcher #6445

Open
randydl opened this issue Dec 25, 2024 · 0 comments
Labels: pending (This problem is yet to be addressed)

Comments

randydl commented Dec 25, 2024

Here is a simple workaround:

my_train.py

import sys
from omegaconf import OmegaConf
from llamafactory.train.tuner import run_exp


def parse_args():
    """Expand any YAML config path in sys.argv into --key=value CLI flags."""
    results = []
    for arg in sys.argv[1:]:
        if not arg.startswith('-') and arg.endswith('.yaml'):
            # positional .yaml argument: load it and flatten each key into a flag
            content = OmegaConf.load(arg)
            for key, value in content.items():
                results.append(f'--{key}={value}')
        else:
            # regular arguments are passed through unchanged
            results.append(arg)
    return results


if __name__ == '__main__':
    # rewrite sys.argv so run_exp() sees plain CLI flags instead of a YAML path
    args = parse_args()
    sys.argv = [sys.argv[0]] + args
    run_exp()
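
For reference, the conversion behaves roughly like this. The config keys below are made up for illustration (they are not the real contents of llama3_lora_sft_ds3.yaml), and the snippet assumes my_train.py is importable with omegaconf and llamafactory installed:

import sys
from my_train import parse_args  # the script above

# a made-up config with two illustrative keys
with open('demo.yaml', 'w') as f:
    f.write('stage: sft\nlora_rank: 8\n')

# the YAML path expands into --key=value flags; other arguments pass through untouched
sys.argv = ['my_train.py', 'demo.yaml', '--output_dir=saves/demo']
print(parse_args())  # ['--stage=sft', '--lora_rank=8', '--output_dir=saves/demo']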

my_run.sh

#!/bin/bash
# Launch my_train.py with the DeepSpeed launcher (2 nodes x 8 GPUs here),
# forwarding all script arguments, including the YAML config path.

deepspeed \
    --num_gpus 8 \
    --num_nodes 2 \
    --hostfile hostfile \
    --master_addr 10.252.32.12 \
    my_train.py "$@"

Run it like this:

bash my_run.sh examples/train_lora/llama3_lora_sft_ds3.yaml
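
Since non-YAML arguments are passed through unchanged, additional flags can still be appended after the config path, for example (the extra flag here is only an illustration):

bash my_run.sh examples/train_lora/llama3_lora_sft_ds3.yaml --output_dir=saves/llama3_demo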
github-actions bot added the pending (This problem is yet to be addressed) label on Dec 25, 2024