[ROCM] Properly disable Flash Attention/Efficient Attention with environment variables #133866
Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/133866
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (3 Unrelated Failures) As of commit 298fee6 with merge base df68315:
FLAKY - The following job failed but was likely due to flakiness present on trunk.
BROKEN TRUNK - The following jobs failed but were present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
LGTM. @xinyazhang Have you done any local testing with a build that has them disabled?
I ran a local build.
The CI build failure is real.
@jeffdaily This is expected if the AOTriton installation doesn't exist (which I think is the case for the CI image).
Both should be OFF to disable AOTriton.
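For AOTriton to drop out of the build, the setup logic has to treat the dependency as needed only while at least one of the two attention backends is still on. A minimal sketch of that flag logic, assuming a hypothetical helper (`env_flag_enabled` and its accepted values are illustrative, not PyTorch's actual setup helper):

```python
import os

def env_flag_enabled(name: str, default: str = "1") -> bool:
    # Hypothetical helper: treat "0"/"off"/"no"/"false" (case-insensitive)
    # as disabled, mirroring the USE_* convention of build scripts.
    return os.getenv(name, default).strip().lower() not in ("0", "off", "no", "false")

# Simulate the build invocation from this PR:
# USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py ...
os.environ["USE_FLASH_ATTENTION"] = "0"
os.environ["USE_MEM_EFF_ATTENTION"] = "0"

use_flash = env_flag_enabled("USE_FLASH_ATTENTION")
use_mem_eff = env_flag_enabled("USE_MEM_EFF_ATTENTION")

# AOTriton is only required when at least one attention backend stays on
need_aotriton = use_flash or use_mem_eff
print(need_aotriton)  # -> False
```

With either variable left unset (defaulting to enabled), `need_aotriton` flips back to `True`, which matches the "both should be OFF" behavior described above.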
Okay, I think I found the problem. Within …
Otherwise we have circular dependencies.
Force-pushed from e767d4b to 693d4e3.
@malfet ,
@pytorchbot merge
Merge started. Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
The merge job was canceled or timed out. This most often happens if two merge requests were issued for the same PR, or if the merge job was waiting for more than 6 hours for tests to finish. In the latter case, please do not hesitate to reissue the merge command.
@jithunnair-amd it seems the merge was blocked by failing CI, which is supposed to be fixed by #133884
@pytorchbot merge -f "Build issues resolved. This PR is for build scenarios not relevant to CI. Test failures are related to GQA which is addressed in #133884."
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
…tion with environment variables (#1570): Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. This is a cherry-picked version of pytorch#133866.
…tion with environment variables (#1571): Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. This is a cherry-picked version of pytorch#133866. Co-authored-by: Pruthvi Madugundu <pruthvigithub@gmail.com>
…ronment variables (#1542): Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. This is a cherry-picked version of pytorch#133866. Tested with `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py develop --user` and `python -c 'import torch'`.
…ronment variables (#133866): Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly. Fixes #125230. Pull Request resolved: #133866. Approved by: https://github.com/jithunnair-amd, https://github.com/jeffdaily, https://github.com/malfet
Now `USE_FLASH_ATTENTION=0 USE_MEM_EFF_ATTENTION=0 python setup.py` can compile correctly.
Fixes #125230
cc @jeffdaily @sunway513 @jithunnair-amd @pruthvistony @ROCmSupport @dllehr-amd @jataylo @hongxiayang
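The cherry-picked commits above verify the fix with a build followed by `python -c 'import torch'`. That smoke test can be scripted generically; the sketch below uses a standard-library module as a stand-in so it runs anywhere, while on a real build you would pass `"torch"` (the helper name is illustrative, not part of the PR):

```python
import subprocess
import sys

def smoke_test_import(module: str) -> bool:
    # Mirrors the `python -c 'import torch'` check from the testing notes:
    # returns True when a fresh interpreter can import the module cleanly.
    result = subprocess.run(
        [sys.executable, "-c", f"import {module}"],
        capture_output=True,
    )
    return result.returncode == 0

# Stand-in module keeps the sketch self-contained; use "torch" against a real build.
print(smoke_test_import("json"))  # -> True
```

Running the import in a subprocess rather than in-process makes the check reflect a clean interpreter start, which is closer to what a user hitting #125230 would experience.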