FIX Prefix tuning test w/ rotary emb on multi GPU by BenjaminBossan · Pull Request #2311 · huggingface/peft

FIX Prefix tuning test w/ rotary emb on multi GPU #2311


Merged

Conversation

BenjaminBossan
Member
@BenjaminBossan BenjaminBossan commented Jan 7, 2025

See huggingface/transformers#35235 (comment) for context.

There has been a refactor in transformers that moved the rotary embedding of Mistral (and probably other models) to the model level. As a result, the device map used in one of the tests became incorrect, causing the nightly CI to fail. This PR fixes the device map (a sketch follows the notes below).

Notes:

  • This fix doesn't really have anything to do with prefix tuning; the error occurs even before prefix tuning is used.
  • The error won't be caught by the PR CI, as it requires 2 GPUs to run and thus is only tested in the nightly CI.
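
For illustration, here is a minimal sketch of the kind of device map involved, assuming a tiny two-layer Mistral checkpoint. This is not the actual test code; the checkpoint name and layer count are assumptions, but the module paths follow the transformers Mistral architecture:

```python
# Minimal sketch (not the actual PEFT test): splitting a Mistral-style model
# across two GPUs with an explicit device map.
from transformers import AutoModelForCausalLM

device_map = {
    "model.embed_tokens": 0,
    "model.layers.0": 0,
    "model.layers.1": 1,
    "model.norm": 1,
    "lm_head": 1,
    # After the transformers refactor, the rotary embedding is a module on
    # the model itself (model.rotary_emb) rather than part of each attention
    # layer, so the device map needs an entry for it as well.
    "model.rotary_emb": 0,
}

model = AutoModelForCausalLM.from_pretrained(
    "hf-internal-testing/tiny-random-MistralForCausalLM",  # assumed tiny model
    device_map=device_map,
)
```

An exhaustive device map like this must assign every top-level module, which is why a refactor that introduces a new model-level module (here, model.rotary_emb) silently breaks maps written before the change.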

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@githubnemo githubnemo merged commit 0b0ff9a into huggingface:main Jan 10, 2025
13 of 14 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-test-rotary-emb-multi-gpu branch January 10, 2025 14:29
Guy-Bilitski pushed a commit to Guy-Bilitski/peft that referenced this pull request May 13, 2025