[ONNX] Migrate torchlib from onnxscript #139301
Migrate the ATen op decomposition library for ONNX ("torchlib") from ONNX Script into PyTorch, with the necessary changes in the ONNX registry. "torchlib" is the name for the decomposition library that translates ATen ops to ONNX; until now it was hosted in the onnxscript repository.
**Test and infra impact**

The tests leverage OpInfo and run in an ONNX-only shard. The added tests finish within 40 seconds on a 10-core local machine. No new dependencies are introduced, and packaging and test activities should remain the same.

**State of the migrated code**

The migrated code is lifted from https://github.com/microsoft/onnxscript/tree/main/onnxscript/function_libs/torch_lib. It was reviewed by the same set of reviewers who own the code in ONNX Script.

**Next steps**

Follow-up PRs will decouple the implementation from the ONNX Script type system to improve type checking, and will bump the ONNX opset version used. |
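As a rough illustration of the OpInfo-style pattern the tests mentioned above follow, here is a generic sketch: each op carries its own sample inputs and a reference implementation, and a single parameterized test loops over the table. This is not the actual PyTorch test code; the `OPS` table and the stand-in implementations are illustrative assumptions (the real tests compare ONNX decomp output against eager PyTorch).

```python
import math
import unittest

# Sketch of OpInfo-style testing: each entry pairs sample inputs with a
# reference function and the implementation under test. Plain Python
# functions stand in for the ONNX decomp and the eager reference.
OPS = {
    "abs": {
        "samples": [-3.0, 0.0, 2.5],
        "ref": abs,
        "impl": lambda x: math.sqrt(x * x),  # stand-in "decomp" of abs
    },
    "neg": {
        "samples": [-1.0, 4.0],
        "ref": lambda x: -x,
        "impl": lambda x: -1.0 * x,
    },
}


class TestOps(unittest.TestCase):
    def test_ops_match_reference(self):
        # One parameterized test covers every op and every sample input,
        # mirroring how OpInfo-driven suites iterate over op databases.
        for name, info in OPS.items():
            for x in info["samples"]:
                with self.subTest(op=name, x=x):
                    self.assertAlmostEqual(info["impl"](x), info["ref"](x))
```

Run with `python -m unittest <module>`; adding a new op is just one more table entry, which is why such suites scale cheaply.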
#139301 Pull Request resolved: #147391 Approved by: https://github.com/titaiwangms
This PR sets up the registry to accept ONNX decomp functions to be moved into PyTorch (#139301). The ops from ONNX Script are currently appended to the registry. When ops are moved into PyTorch, the moved ops take precedence because they appear first in the registry list. After the migration, the hooks for loading ops from ONNX Script will be removed.

1. Use a private field `_pt_onnx_signature` to store function signatures to avoid conflicts.
2. Update the registry to record the signature in `OnnxDecompMeta` and update the dispatcher to leverage the data structure.
3. Update the registry to prepare for ONNX op registration, and update the `onnx_impl` decorator to support a `no_compile` option.

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com> Pull Request resolved: #147396 Approved by: https://github.com/titaiwangms
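The precedence rule described above can be sketched in a few lines: migrated ops are placed at the front of the per-target list, onnxscript fallbacks are appended at the back, and dispatch picks the first match. The names `OnnxDecompMeta` and the field layout below follow the PR description loosely but are illustrative assumptions, not the actual PyTorch data structures.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class OnnxDecompMeta:
    """Illustrative record for one decomp function (fields are assumptions)."""
    onnx_function: Callable   # the decomp implementation
    fx_target: str            # e.g. "aten::add"
    signature: tuple = ()     # recorded so the dispatcher can match arguments


class Registry:
    def __init__(self):
        self._registry: dict[str, list[OnnxDecompMeta]] = {}

    def register(self, meta: OnnxDecompMeta, front: bool = False) -> None:
        decomps = self._registry.setdefault(meta.fx_target, [])
        if front:
            # Ops migrated into PyTorch go first so they take precedence
            # over the onnxscript-loaded fallbacks appended later.
            decomps.insert(0, meta)
        else:
            decomps.append(meta)

    def dispatch(self, fx_target: str) -> Callable:
        # The first matching entry wins.
        return self._registry[fx_target][0].onnx_function


registry = Registry()
registry.register(OnnxDecompMeta(lambda x: ("onnxscript_add", x), "aten::add"))
registry.register(OnnxDecompMeta(lambda x: ("pt_add", x), "aten::add"), front=True)
print(registry.dispatch("aten::add")(1))  # the migrated op wins: ('pt_add', 1)
```

This ordering trick is what lets the migration proceed op by op: each moved op silently shadows its onnxscript counterpart without any flag-flipping, and the fallback loading hook can be deleted once the list of appended entries is empty.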
Create scaffold for onnx op test data and common logic. This PR creates the scaffolding for new onnx decomp functions described in #139301. It adds two ops: abs and add, and enables the related tests. #139301 Pull Request resolved: #147392 Approved by: https://github.com/titaiwangms ghstack dependencies: #147396
Thanks for the details @justinchuby! That being said, this kind of major migration is sometimes needed, but it is always disruptive to users and maintainers and so needs care.
For the particular ask here, I would say:
Thanks! For the points you shared:
I fully agree with the goals here.
Currently it completes the new ONNX exporter that works on FX, so the logic itself is not "replacing" anything. The logic, along with the whole export flow, will replace the JIT/TorchScript-based ONNX exporter. It is as removable as TorchScript in torch (i.e., not immediately, because of the vast number of legacy dependents), but we are encouraging users to migrate to the new flow.
Sure - that is done according to your suggestion:
That is because the CI runtime in onnxscript is significantly shorter (on the order of 2-5 minutes), which allows faster iteration. We also had more frequent releases, so users could pick up the latest updates sooner than they could with PyTorch. In short, every one of the PRs was peer-reviewed with the same (high) quality we hold among ourselves, and each can be traced in onnxscript. The logic is also used in production.
@albanD let us know if the info above addresses some of your concerns? The migration will also avoid issues like microsoft/onnxscript#2085, which affects development in core.
Since there is not much movement in the discussion and the migration is blocked, I am going to create split-up PRs and handle the migration piecemeal instead. cc @MaanavD @xadupre @gramalingam @titaiwangms
Now that the torchlib translation library is close to stable in onnxscript, it is ready to be migrated into PyTorch so that
Considerations