Support complex type in ONNX export #59246
Comments
I have confirmed this is definitely a complex number issue, as I get the same error after substituting the following into the previous example:

def model(self):
    return torch.complex(torch.tensor(1.), torch.tensor(1.))
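For reference, a self-contained version of that reproduction might look like the sketch below; the module name and dummy input are illustrative and not taken from the original report (the input tensor is fed to both the real and imaginary parts so the traced graph has an input):

```python
import torch

class ComplexModel(torch.nn.Module):
    def forward(self, x):
        # The output dtype is ComplexFloat, which the exporter's
        # scalar-type mapping does not handle on affected versions.
        return torch.complex(x, x)

dummy = torch.tensor(1.0)
# Expected to raise the unsupported-type error reported in this issue.
torch.onnx.export(ComplexModel(), (dummy,), "complex_model.onnx")
```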
@david-macleod Thanks for reporting! Indeed this is where the error arises, and there could be other places that need to be checked and updated, such as shape type inference and symbolic functions, to support complex computation. We will take a look and evaluate what is needed.
Is there any way to export a network with complex floats now?
@sunwell1994 This is being tracked internally. We are evaluating the design for both the exporter and ONNX with respect to complex types and operators. For the moment, could you try working around the issue by unwrapping the complex type into two separate tensors?
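As a minimal sketch of that workaround (the module and the multiply operation here are illustrative, not from the thread): keep the real and imaginary parts as separate real-valued tensors and write the complex arithmetic out by hand, so no complex dtype ever appears in the exported graph.

```python
import torch

class ComplexMultiply(torch.nn.Module):
    """Multiply two complex numbers given as (real, imag) tensor pairs."""
    def forward(self, a_re, a_im, b_re, b_im):
        # (a_re + i*a_im) * (b_re + i*b_im)
        out_re = a_re * b_re - a_im * b_im
        out_im = a_re * b_im + a_im * b_re
        return out_re, out_im

inputs = tuple(torch.randn(4) for _ in range(4))
# Only real-valued ops are traced, so this exports without complex support.
torch.onnx.export(ComplexMultiply(), inputs, "complex_multiply.onnx")
```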
Tracked internally at Microsoft by https://msdata.visualstudio.com/Vienna/_workitems/edit/1442180
Complex types are a known limitation of the ONNX exporter. Although there is no estimate on when this could be resolved, it is being tracked internally.
Any update about this issue?
Any updates?
Complex support is added in the
Ops related to FFT can hit this error. Is it supported now, or is there some way to work around it?
I ran into the same error when converting the LaMa model to ONNX, and I think the cause is related to FFT ops. I tried to use torch.onnx.dynamo_export to solve it, but found that PyTorch 2.0.1 does not support dynamo_export. Any advice on how to solve this? Thanks!
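One workaround pattern along those lines, as a sketch (the RealDFT module and sizes are hypothetical, and a real FFT-based model such as LaMa would need rfft/2-D variants): express the DFT with real-valued matrix multiplications, so neither a complex dtype nor an aten FFT op appears in the traced graph.

```python
import math
import torch

class RealDFT(torch.nn.Module):
    """DFT over the last dimension built from real-valued matmuls."""
    def __init__(self, n):
        super().__init__()
        k = torch.arange(n).unsqueeze(0)      # frequency index, shape (1, n)
        t = torch.arange(n).unsqueeze(1)      # time index, shape (n, 1)
        angle = -2.0 * math.pi * k * t / n    # (n, n) matrix of phases
        self.register_buffer("cos", torch.cos(angle))
        self.register_buffer("sin", torch.sin(angle))

    def forward(self, x):
        # Real and imaginary parts returned as two real-valued tensors.
        return x @ self.cos, x @ self.sin

n = 16
model = RealDFT(n)
x = torch.randn(2, n)
torch.onnx.export(model, (x,), "real_dft.onnx")  # graph contains only MatMul-style ops

# Sanity check against torch.fft.fft
re, im = model(x)
ref = torch.fft.fft(x, dim=-1)
assert torch.allclose(re, ref.real, atol=1e-4) and torch.allclose(im, ref.imag, atol=1e-4)
```

The trade-off is that this is an O(n²) matmul rather than an O(n log n) FFT, so it buys exportability at the cost of speed and memory for large transform sizes.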
I'm on 2.3.0.dev20240219+cu121 and I still get the same error.
+1
+1, any update on the complex support?
+1, torch 2.3.0 with torch.onnx.export also has this problem.
Is it supported with https://pytorch.org/docs/stable/onnx_dynamo.html or not?
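A quick way to probe that on a given install, as a sketch (the module and file names are illustrative, and whether a particular complex or FFT op is covered depends on the PyTorch version's dynamo exporter op registry, so this may still fail):

```python
import torch

class ComplexAbs(torch.nn.Module):
    def forward(self, x):
        # Complex intermediate, real-valued output.
        return torch.complex(x, x).abs()

model = ComplexAbs()
x = torch.randn(4)

if hasattr(torch.onnx, "dynamo_export"):  # available from PyTorch 2.1 onward
    onnx_program = torch.onnx.dynamo_export(model, x)
    onnx_program.save("complex_abs.onnx")
else:
    # Fall back to the TorchScript-based exporter (hits this issue's error).
    torch.onnx.export(model, (x,), "complex_abs.onnx")
```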
🐛 Bug
When exporting a model with `torch.onnx.export` I receive the following error. The error is being raised from here, and I am guessing the cause of the issue is that the return type from the op is `ComplexFloat` (see trace below), which doesn't seem to be covered by the switch statement (unless that is actually built from a more primitive type). However, I also see a reference to `ComplexFloat` in symbolic_helper.py here, which suggests it is supported?

To Reproduce
Expected behavior

Successful export of the ONNX graph with `ComplexFloat` output type.

Environment
cc @houseroad @spandantiwari @lara-hdr @BowenBao @neginraoof @SplitInfinity