Adam doesn't work with nonzero-dim Tensor betas · Issue #147921 · pytorch/pytorch

Open
Tony-Y opened this issue Feb 26, 2025 · 0 comments · May be fixed by #149939
Labels
module: optimizer Related to torch.optim triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@Tony-Y (Contributor) commented Feb 26, 2025

🐛 Describe the bug

This bug was pointed out in #145461 (comment). PR #145674 fixed the Tensor lr issue, but not the Tensor betas issue.
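For context, the betas coefficients control Adam's two exponential moving averages, and the report is that passing them as nonzero-dim Tensors (e.g. a 1-element tensor rather than a plain float or a 0-dim scalar tensor) fails. A minimal sketch of the reference Adam update in plain Python with float betas — the textbook math, not the PyTorch implementation:

```python
import math

def adam_step(param, grad, m, v, step, lr=1e-3, betas=(0.9, 0.999), eps=1e-8):
    """One Adam update for a single scalar parameter (reference math only)."""
    beta1, beta2 = betas
    m = beta1 * m + (1 - beta1) * grad          # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment EMA
    m_hat = m / (1 - beta1 ** step)             # bias correction
    v_hat = v / (1 - beta2 ** step)
    param = param - lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v
```

Since beta1 and beta2 enter as scalar coefficients in every line above, a Tensor-valued betas only works if the optimizer's fused/capturable code paths handle the tensor's shape consistently, which is what this issue reports is broken for nonzero-dim tensors.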

Versions

The same as #145461

cc @vincentqb @jbschlosser @albanD @janeyx99 @crcrpar

@drisspg added the module: optimizer and triaged labels on Mar 3, 2025