Exporting the operator 'aten::fft_fft2' to ONNX opset version 18 is not supported. · Issue #98833 · pytorch/pytorch · GitHub

Closed
Radyngzh opened this issue Apr 11, 2023 · 9 comments
Labels
module: onnx Related to torch.onnx · triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Comments

@Radyngzh

Issue description

Exporting the operator 'aten::fft_fft2' to ONNX opset version 18 is not supported.
I am trying to convert a torch model to an ONNX model. How can I solve this problem?

  • PyTorch version: 2.0.0
  • ONNX version: 1.13.1
  • Python version: 3.8.10
  • CUDA/cuDNN version: 11.2
  • GPU models and configuration: RTX 3090 24G
@malfet malfet added the module: onnx Related to torch.onnx label Apr 11, 2023
@ngimel ngimel added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Apr 11, 2023
@shushanxingzhe

Encountering a similar problem; waiting for a reply...
Exporting the operator 'aten::fft_rfft' to ONNX opset version 14 is not supported

@abock abock added this to ONNX Jun 14, 2023
@github-project-automation github-project-automation bot moved this to Inbox in ONNX Jun 14, 2023
@Pondiniii

Can confirm that this is still not supported on PyTorch 2.0.1 and PyTorch nightly. I used opset version 18.

@justinchuby
Collaborator

Opset 18 and fft will be supported by the dynamo exporter. Closed as a duplicate of #107588

@justinchuby justinchuby closed this as not planned Won't fix, can't repro, duplicate, stale Aug 24, 2023
@github-project-automation github-project-automation bot moved this from Inbox to Done in ONNX Aug 24, 2023
@EmreOzkose
EmreOzkose commented Oct 12, 2023

My torch version is 2.0.1+cu117 and I still cannot export.

@justinchuby should I use another export function (I am using torch.onnx.export) to utilize dynamo?

@EvelynCarter

Has this been solved? How did you solve it?

@1334926196

May I ask: has the problem of 'aten::fft_fft2' not exporting to ONNX format been solved?

@1334926196

Issue description

Exporting the operator 'aten::fft_fft2' to ONNX opset version 18 is not supported. Trying to convert a torch model to an ONNX model. How can I solve this problem?

  • PyTorch version: 2.0.0
  • ONNX version: 1.13.1
  • Python version: 3.8.10
  • CUDA/cuDNN version: 11.2
  • GPU models and configuration: RTX 3090 24G

May I ask: has the problem of 'aten::fft_fft2' not exporting to ONNX format been solved?

@justinchuby
Collaborator

Please test with torch.onnx.export(..., dynamo=True, report=True) using the latest torch-nightly. Attach the generated report if there is an error. Thanks!
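The suggested workflow can be sketched as follows. This is a minimal, hedged example: the model class, tensor shapes, and output file name are illustrative, not from the thread. `torch.fft.fft2` is the call that lowers to `aten::fft_fft2`, the op the TorchScript-based exporter cannot handle; `dynamo=True` selects the newer exporter, and `report=True` asks it to emit a diagnostic report on failure. Running the export itself requires a recent torch (nightly at the time of this thread).

```python
# Minimal sketch (assumed model and file name) of exporting a model
# that uses fft2 via the dynamo-based ONNX exporter.
import torch


class FFT2Model(torch.nn.Module):
    """Toy model whose forward lowers to aten::fft_fft2."""

    def forward(self, x):
        # fft2 returns a complex tensor; take the real part so the
        # exported graph produces a real-valued output.
        return torch.fft.fft2(x).real


def export_with_dynamo(path="fft2_model.onnx"):
    # dynamo=True selects the newer exporter with fft support;
    # report=True writes a diagnostic report if the export fails.
    model = FFT2Model()
    example = torch.randn(1, 3, 32, 32)
    torch.onnx.export(model, (example,), path, dynamo=True, report=True)
```

If the export fails, the generated report is what the maintainer asks to have attached to the issue.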

Projects
Status: Done
Development

No branches or pull requests

9 participants