I have a Module that computes a Tensor vector from a Tensor vector. I wish to compute the gradient of each output element with respect to each input element. The PyTorch documentation shows that this can be done with torch.autograd.functional.jacobian, but I'm not sure the same can be done using TorchSharp.
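For concreteness, here is a minimal sketch of the kind of setup I mean; the Linear module, the shapes, and the variable names are only illustrative, not my actual code.

```csharp
using TorchSharp;
using static TorchSharp.torch;

// Illustrative module: maps a 3-element input vector to a 2-element output vector.
var model = nn.Linear(3, 2);

// The input must track gradients so that d(output)/d(input) can be computed.
var input = randn(3).requires_grad_(true);
var output = model.forward(input);   // shape: [2]

// Goal: the full Jacobian J[i, j] = d output[i] / d input[j], here of shape [2, 3].
```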
I tried looping over the elements of the output tensor and calling backward on each element:

```csharp
// Back-propagate each output element separately, accumulating into input.grad.
for (int i = 0; i < output.Length; i++)
    output[i].backward(create_graph: false, retain_graph: true);
```
But even with retain_graph=true, that results in the error:

System.Runtime.InteropServices.ExternalException: 'Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.'
Is there already a way to compute the Jacobian of a function/module in TorchSharp? If not, can you suggest how I could get started on adding support for the above PyTorch functionality?
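In case it helps the discussion, here is a rough sketch of how a row-by-row Jacobian might be assembled. It assumes TorchSharp exposes torch.autograd.grad with PyTorch-like parameter names (grad_outputs, retain_graph) and that retain_graph is honored there; the JacobianHelper class and the 1-D shape restriction are my own, not an existing API.

```csharp
using TorchSharp;
using static TorchSharp.torch;

public static class JacobianHelper
{
    // Builds J[i, j] = d output[i] / d input[j] for a 1-D output (length n) and 1-D input (length m).
    public static Tensor Jacobian(Tensor output, Tensor input)
    {
        var n = output.shape[0];
        var rows = new Tensor[n];

        // One-hot seed vectors, one per output element, matching the output's dtype.
        var eye = torch.eye(n).to(output.dtype);

        for (long i = 0; i < n; i++)
        {
            // Back-propagate the i-th unit vector; retain_graph keeps the saved
            // intermediate values alive so the next iteration can reuse the graph.
            var grads = torch.autograd.grad(
                new[] { output }, new[] { input },
                grad_outputs: new[] { eye[i] },
                retain_graph: true);

            rows[i] = grads[0];
        }

        return torch.stack(rows);   // shape: [n, m]
    }
}
```

With the earlier example, JacobianHelper.Jacobian(output, input) would return a [2, 3] tensor. If retain_graph also fails with autograd.grad, an alternative is to re-run the module's forward pass inside the loop so each backward call gets a fresh graph, at the cost of one forward pass per output element.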