Closed
Labels
actionable · module: autograd (Related to torch.autograd, and the autograd engine in general) · module: forward ad · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
Description
The goal here is to add forward AD support to the following functions (more can be added if needed):
- clamp Add forward AD support for clamp.Tensor #74042
- atan2
- polar
- logcumsumexp
A good example of a PR adding such support is in #66092
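For context on what "forward AD support" means here, below is a minimal sketch of how forward-mode AD is exercised through `torch.autograd.forward_ad`. It uses `torch.sin` (which already has a forward derivative) purely to illustrate the dual-number mechanism that the ops listed above need to plug into; the choice of `sin` and the sample values are illustrative, not from the issue.

```python
import torch
import torch.autograd.forward_ad as fwAD

# A dual tensor carries a tangent alongside the primal value; evaluating
# an op on it computes the Jacobian-vector product in one forward pass.
primal = torch.tensor([1.0, 2.0])
tangent = torch.tensor([0.5, -1.0])

with fwAD.dual_level():
    dual = fwAD.make_dual(primal, tangent)
    out = torch.sin(dual)
    y, jvp = fwAD.unpack_dual(out)

# For sin, the JVP should equal cos(primal) * tangent.
```

An op without forward AD support would raise a `NotImplementedError` (or fall back to an error about the missing forward derivative) inside the `dual_level` context; adding the `forward` formula in `derivatives.yaml`, as in the referenced PR, is what makes this work.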
For the testing: if the op being updated does not yet have an OpInfo entry, add an ad-hoc test in test_autograd.py that calls gradcheck on one sample input, with a TODO saying it should be removed once we have an OpInfo for that op.
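The ad-hoc test described above might look like the following sketch. It uses `atan2` as the example op and assumes its forward AD formula has already landed; the `check_forward_ad=True` flag tells `gradcheck` to verify forward-mode gradients against numerical Jacobians in addition to the backward-mode check.

```python
import torch
from torch.autograd import gradcheck

# TODO: remove once an OpInfo entry exists for this op (pattern from the issue).
# Double precision inputs keep the numerical Jacobian comparison stable.
x = torch.randn(3, dtype=torch.double, requires_grad=True)
y = torch.randn(3, dtype=torch.double, requires_grad=True)

# gradcheck returns True on success and raises on failure.
ok = gradcheck(torch.atan2, (x, y), check_forward_ad=True)
```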
cc @ezyang @albanD @zou3519 @gqchen @pearu @nikitaved @soulitzer @lezcano @Varal7