Update on "Add autograd hook for python rpc call" · pytorch/pytorch@5994e17 · GitHub
Commit 5994e17
Update on "Add autograd hook for python rpc call"
1. Currently, if the autograd context is valid, an RPC is sent with autograd metadata even when the tensors do not require grads and no grad functions are attached. This is not ideal. This diff ensures that an RPC carries autograd metadata only if the autograd context is valid and the tensors require grads (see the sketch after this message).
2. Meanwhile, create a utility to attach autograd info and functions as needed.
3. Add autograd send/recv functions for Python RPC calls.
4. Make changes to support nested Python RPC calls.
5. Disallow nested dist autograd contexts (landed in #27022).

Differential Revision: [D17819153](https://our.internmc.facebook.com/intern/diff/D17819153/)

[ghstack-poisoned]
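For context, here is a minimal sketch of the gating rule in points 1-2 and the context usage implied by points 3-4. `should_attach_autograd_meta` is a hypothetical helper for illustration only, not the utility added in this commit, and the `dist_autograd`/`rpc` calls use the current public API, whose signatures may differ from the code at this commit.

```python
# Sketch of the behavior described above: autograd metadata travels with
# an RPC only when a distributed autograd context is active AND some
# tensor argument requires grad. Assumes a PyTorch build with
# distributed support.
from typing import List

import torch
import torch.distributed.autograd as dist_autograd
import torch.distributed.rpc as rpc


def should_attach_autograd_meta(context_is_valid: bool,
                                tensors: List[torch.Tensor]) -> bool:
    # Send a plain RPC (no send/recv autograd functions) unless both
    # conditions hold. Hypothetical helper, not the commit's utility.
    return context_is_valid and any(t.requires_grad for t in tensors)


# No active context, or no grad-requiring tensors -> plain RPC.
assert not should_attach_autograd_meta(False, [torch.ones(2, requires_grad=True)])
assert not should_attach_autograd_meta(True, [torch.ones(2)])
assert should_attach_autograd_meta(True, [torch.ones(2, requires_grad=True)])


def run_distributed_backward():
    # Requires rpc.init_rpc(...) to have been called on each worker and a
    # peer named "worker1" to exist; uses today's public API.
    with dist_autograd.context() as context_id:
        t = torch.ones(2, 2, requires_grad=True)
        # Python RPC call inside a valid context with a grad-requiring
        # tensor: send/recv autograd functions are attached (point 3).
        loss = rpc.rpc_sync("worker1", torch.add, args=(t, t)).sum()
        dist_autograd.backward(context_id, [loss])
        grads = dist_autograd.get_gradients(context_id)
        return grads
```

With a tensor that does not require grad, or outside a `dist_autograd.context()`, the same `rpc_sync` call would be sent without autograd metadata, which is exactly the change point 1 describes.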
1 parent a7c43f1 commit 5994e17

File tree

1 file changed (+0, -1 lines)

test/dist_autograd_test.py

Lines changed: 0 additions & 1 deletion
@@ -3,7 +3,6 @@
 import time
 import unittest

-from enum import IntEnum
 import torch
 import torch.distributed.autograd as dist_autograd
 import torch.distributed.rpc as rpc

0 commit comments