
Custom autograd functions are not inlined when export mode is ONNX_ATEN_FALLBACK #85027


Closed
natke opened this issue Sep 14, 2022 · 2 comments
Assignees
Labels
module: onnx Related to torch.onnx · triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

natke commented Sep 14, 2022

🐛 Describe the bug

I am working with a partner who is using operator_export_type=OperatorExportTypes.ONNX_ATEN_FALLBACK to export a greater number of models without error, with the trade-off that inference must run with the PyTorch ATen library included.

They are trying to add support for models that contain custom autograd functions, and have been testing with PyTorch nightly (since this PR went in: #74765).

What we would like to see when operator_export_type=OperatorExportTypes.ONNX_ATEN_FALLBACK:

  1. If all of the functionality in the custom autograd function is supported by ONNX,
    then the custom autograd function is inlined and the graph is exported with all ONNX ops.

  2. If the custom autograd function contains aten ops,
    then the custom autograd function is still inlined and the graph is exported with the aten ops (a minimal sketch of such a function follows this list).
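
For concreteness, here is a minimal sketch of the kind of custom autograd function involved. The names MyRelu and Model are illustrative placeholders, not taken from the linked sample; this forward pass is expressible with standard ONNX ops, so it corresponds to case 1 above.

    import torch

    class MyRelu(torch.autograd.Function):
        # Hypothetical custom autograd function, for illustration only.
        # Its forward is expressible with standard ONNX ops, so under
        # case 1 above we would expect it to be inlined on export.

        @staticmethod
        def forward(ctx, x):
            ctx.save_for_backward(x)
            return x.clamp(min=0)

        @staticmethod
        def backward(ctx, grad_output):
            (x,) = ctx.saved_tensors
            return grad_output * (x > 0).type_as(grad_output)

    class Model(torch.nn.Module):
        def forward(self, x):
            return MyRelu.apply(x)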

What we observe currently:

  1. operator_export_type=OperatorExportTypes.ONNX

    If the functionality in the custom autograd function is supported by ONNX ops then the graph is exported to ONNX correctly.

    Export fails if the custom autograd function contains an aten op, with the error message “Could not export operator aten:XXX”.

  2. operator_export_type=OperatorExportTypes.ONNX_FALLTHROUGH

    The custom autograd function is exported as a PythonOp node, whether or not its functionality is supported by ONNX.

  3. operator_export_type=OperatorExportTypes.ONNX_ATEN_FALLBACK

    The export fails with the error “Couldn’t export Python operator”.

A small sample demonstrating the above findings: https://github.com/natke/custom_autograd/blob/main/model.py. A rough repro sketch follows.
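
As a rough sketch (not the linked sample itself), the three export modes can be compared like this, reusing the hypothetical Model class sketched above; the exact failure messages depend on the PyTorch nightly in use.

    import io
    import torch
    from torch.onnx import OperatorExportTypes

    model = Model()            # Model from the sketch above (hypothetical)
    dummy = torch.randn(2, 3)  # example input; shape chosen arbitrarily

    # Try each operator export type and report whether export succeeds.
    for mode in (
        OperatorExportTypes.ONNX,
        OperatorExportTypes.ONNX_FALLTHROUGH,
        OperatorExportTypes.ONNX_ATEN_FALLBACK,
    ):
        buf = io.BytesIO()
        try:
            torch.onnx.export(model, dummy, buf, operator_export_type=mode)
            print(mode, "-> export succeeded")
        except Exception as exc:
            print(mode, "-> export failed:", exc)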

Versions

Python platform: macOS-10.16-x86_64-i386-64bit
Is CUDA available: False
CUDA runtime version: No CUDA
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

Versions of relevant libraries:
[pip3] numpy==1.23.1
[pip3] torch==1.13.0.dev20220908
[pip3] torchaudio==0.13.0.dev20220908
[pip3] torchvision==0.14.0.dev20220908
[conda] blas                      1.0                         mkl  
[conda] mkl                       2021.4.0           hecd8cb5_637  
[conda] mkl-service               2.4.0            py39h9ed2024_0  
[conda] mkl_fft                   1.3.1            py39h4ab4a9b_0  
[conda] mkl_random                1.2.2            py39hb2f4e1b_0  
[conda] numpy                     1.23.1           py39h2e5f0a9_0  
[conda] numpy-base                1.23.1           py39h3b1a694_0  
[conda] pytorch                   1.13.0.dev20220908         py3.9_0    pytorch-nightly
[conda] torchaudio                0.13.0.dev20220908        py39_cpu    pytorch-nightly
[conda] torchvision               0.14.0.dev20220908        py39_cpu    pytorch-nightly
natke changed the title from “Custom atuograd functions are not inlined when export mode is ONNX_ATEN_FALLBACK” to “Custom autograd functions are not inlined when export mode is ONNX_ATEN_FALLBACK” Sep 14, 2022
shubhambhokare1 self-assigned this Sep 14, 2022
dagitses added the module: onnx and triaged labels Sep 16, 2022
pytorchmergebot pushed a commit that referenced this issue Oct 14, 2022
abock added this to ONNX Jun 14, 2023
github-project-automation bot moved this to Inbox in ONNX Jun 14, 2023
thiagocrepaldi (Collaborator) commented

@natke is this still relevant, or can we close it?

natke (Author) commented May 1, 2024

I think we fixed it?

github-project-automation bot moved this from Inbox to Done in ONNX May 1, 2024