hotfix train_lora deepspeed problem by ericzhou571 · Pull Request #2003 · lm-sys/FastChat · GitHub

hotfix train_lora deepspeed problem #2003


Closed
wants to merge 1 commit

Conversation

@ericzhou571 (Contributor) commented Jul 19, 2023

@ZYHowell

Why are these changes needed?

Fix the problem that the Trainer does not have deepspeed-related attributes when deepspeed is not enabled. #1458 (comment)

Closes #1458
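
For context, a minimal sketch of the guard this PR proposes; the helper name zero3_enabled is hypothetical, not part of the PR:

def zero3_enabled(trainer) -> bool:
    # trainer.args.deepspeed is falsy when no deepspeed config was passed,
    # so the deepspeed-only attribute hf_deepspeed_config_orig is never
    # read when deepspeed is disabled.
    return bool(trainer.args.deepspeed) and trainer.hf_deepspeed_config_orig.is_zero3()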

Checks

  • I've run format.sh to lint the changes in this PR.
  • I've included any doc changes needed.
  • I've made sure the relevant tests are passing (if applicable).

@@ -180,7 +180,7 @@ def train():
     trainer.save_state()

     # check if zero3 mode enabled
-    if deepspeed.is_deepspeed_zero3_enabled():
+    if trainer.args.deepspeed and trainer.hf_deepspeed_config_orig.is_zero3():
@BabyChouSr (Collaborator) commented on this line Jul 19, 2023


I believe that deepspeed.is_deepspeed_zero3_enabled() is used because of transformers version >=4.30.0. The line trainer.hf_deepspeed_config_orig.is_zero3() works on 4.28 but does not exist in 4.30.0; I'm assuming it was removed in 4.30.0.

If we want to conform to transformers 4.30.0, I would suggest keeping it as deepspeed.is_deepspeed_zero3_enabled().
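
For illustration, a version-tolerant sketch of the suggested check. The import fallback is an assumption on my part, not something from this thread; transformers has moved its deepspeed helpers between modules across releases:

try:
    # newer transformers expose the helper via the integrations package
    from transformers.integrations import is_deepspeed_zero3_enabled
except ImportError:
    # older releases (e.g. 4.28) expose it via transformers.deepspeed
    from transformers.deepspeed import is_deepspeed_zero3_enabled

if is_deepspeed_zero3_enabled():
    ...  # under ZeRO-3, gather the partitioned parameters before saving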

@merrymercy (Member)

We will migrate to transformers 4.30 soon #2016
@BabyChouSr @ericzhou571 Is this PR still required?

@BabyChouSr (Collaborator)

I don't think so, since the original user's error appears to stem from a transformers version that is incompatible with the code. With transformers >=4.30.0, the training script should work.
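
A quick way to fail fast on such a mismatch, as a hedged sketch; the 4.30.0 floor comes from this thread, and packaging is assumed to be available (it is a transformers dependency):

import transformers
from packaging import version

# Fail with a clear message up front instead of an AttributeError mid-save.
assert version.parse(transformers.__version__) >= version.parse("4.30.0"), (
    "the zero3 check in train_lora.py assumes transformers >= 4.30.0; found "
    + transformers.__version__
)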

@merrymercy (Member)

Close this for now.

@merrymercy closed this Jul 19, 2023
Development

Successfully merging this pull request may close these issues.

Issue with Zero3 Mode and State Dictionary Saving - Related to Issue 1271
4 participants