Update on "Add autograd hook for python rpc call" · pytorch/pytorch@c0cd280 · GitHub

Commit c0cd280

Update on "Add autograd hook for python rpc call"
1. Currently, if the autograd context is valid, an rpc is sent with autograd meta even when the tensors do not require grads and no grad functions are attached. This is not ideal. This diff makes sure an rpc with autograd meta is sent only if the autograd context is valid and the tensors require grads.
2. Meanwhile, create a utility to attach autograd info and functions as needed.
3. Add autograd send/recv functions for python rpc call.
4. Make changes to support nested python rpc calls.
5. Disallow nested dist autograd contexts (was landed in #27022).

Differential Revision: [D17819153](https://our.internmc.facebook.com/intern/diff/D17819153/)

[ghstack-poisoned]
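A minimal standalone sketch of the check described in item 1, using placeholder types and a hypothetical helper signature rather than the real torch::distributed::rpc API: autograd metadata is attached only when a dist autograd context is valid and at least one tensor in the message requires grad; otherwise the plain message is sent unchanged.

// Hypothetical, self-contained illustration; not the actual PyTorch code.
#include <algorithm>
#include <iostream>
#include <utility>
#include <vector>

struct Tensor {
  bool requires_grad = false;  // stand-in for torch::Tensor::requires_grad()
};

struct Message {
  std::vector<Tensor> tensors;  // tensors carried in the rpc payload
};

struct DistAutogradContext {};  // placeholder for the per-call autograd context

// Mirrors the shape of getMessageWithAutogradCheck: decide whether to wrap
// the outgoing rpc message with autograd metadata.
Message getMessageWithAutogradCheck(
    DistAutogradContext* ctx,
    Message&& wrappedRpcMsg,
    bool* attachedAutograd) {
  const bool anyRequiresGrad = std::any_of(
      wrappedRpcMsg.tensors.begin(),
      wrappedRpcMsg.tensors.end(),
      [](const Tensor& t) { return t.requires_grad; });

  if (ctx != nullptr && anyRequiresGrad) {
    // The real code would build an RpcWithAutograd message and attach a
    // send autograd function here; this sketch only records the decision.
    *attachedAutograd = true;
    return std::move(wrappedRpcMsg);
  }
  // No valid context or no grad-requiring tensors: send the original message.
  *attachedAutograd = false;
  return std::move(wrappedRpcMsg);
}

int main() {
  DistAutogradContext ctx;
  Message msg{{Tensor{true}, Tensor{false}}};
  bool attached = false;
  getMessageWithAutogradCheck(&ctx, std::move(msg), &attached);
  std::cout << "autograd meta attached: " << std::boolalpha << attached << "\n";
}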
1 parent 4dcc83c commit c0cd280

File tree

  • torch/csrc/distributed/autograd

1 file changed (+2, −2 lines)

torch/csrc/distributed/autograd/utils.h

Lines changed: 2 additions & 2 deletions
@@ -36,13 +36,13 @@ TORCH_API DistAutogradContext* addRecvRpcBackward(
 // and attach autograd function for each type of rpc call if it has valid
 // context and tensors require grads, in this case, return RpcWithAutograd
 // message; otherwise return original rpc message.
-rpc::Message getMessageWithAutogradCheck(
+TORCH_API rpc::Message getMessageWithAutogradCheck(
     const rpc::worker_id_t dstId,
     rpc::Message&& wrappedRpcMsg,
     rpc::MessageType msgType);
 
 // Send message after autograd checking
-std::shared_ptr<torch::distributed::rpc::FutureMessage> sendMessage(
+TORCH_API std::shared_ptr<torch::distributed::rpc::FutureMessage> sendMessage(
     rpc::RpcAgent& agent,
     const rpc::WorkerInfo& dst,
     rpc::Message&& wrappedRpcMsg,
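The only change in this hunk is adding TORCH_API to the two declarations so the symbols are exported from the library and can be linked against from other components (such as the Python bindings). A rough idea of what such an export macro provides, shown with a hypothetical macro name rather than the real c10 macro definitions:

// Hypothetical visibility macro, for illustration only; TORCH_API's actual
// definition lives in c10's export macro headers and is more involved.
#if defined(_WIN32)
  #define EXAMPLE_API __declspec(dllexport)   // when building the library itself
#else
  #define EXAMPLE_API __attribute__((visibility("default")))
#endif

// Without such an annotation, a symbol built with hidden default visibility
// (e.g. -fvisibility=hidden) may not be linkable from outside the library.
EXAMPLE_API int exportedAdd(int a, int b);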

0 commit comments
