[MTIA Aten Backend] Migrate "_unsafe_view" and "view" ops from out-of-tree to pytorch in-tree #153670


Open

andyanwang wants to merge 1 commit into base: main

Conversation

andyanwang

Summary:

Context

The MTIA New Aten Backend work is essentially to move MTIA operators from pytorch out-of-tree to in-tree, with the following benefits:

  1. Avoid duplicate code copied from pytorch, e.g. view op implementations and util functions.
  2. Utilize TensorIterator and structured kernel codegen, avoiding manual implementation of broadcasting, dtype casting, assertions, etc.
  3. Eliminate MTIA's own codegen flow, which is unnecessary complexity.
  4. Overall, make MTIA's aten backend more pytorch native.

Differential Revision: D74672464
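
For orientation, here is a minimal Python sketch of the two operators this PR migrates. It assumes an MTIA-enabled PyTorch build with a reachable MTIA device and that `torch.mtia.is_available()` is present in that build (neither is shown in this PR); on such a build, both calls should route through the in-tree MTIA aten kernels rather than the old out-of-tree implementations.

```python
import torch

# Minimal sketch (assumption: MTIA-enabled build and an available "mtia" device).
# On such a build, both calls below should now dispatch through the in-tree
# MTIA aten backend instead of the out-of-tree kernels this work removes.
if torch.mtia.is_available():
    x = torch.randn(2, 3, 4, device="mtia")
    y = x.view(6, 4)                            # aten::view
    z = torch.ops.aten._unsafe_view(x, [6, 4])  # aten::_unsafe_view
    print(y.shape, z.shape)                     # torch.Size([6, 4]) twice
```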

pytorch-bot bot commented May 15, 2025

This appears to be a diff that was exported from phabricator, but the PR author does not have sufficient permissions to run CI. @andyanwang, please do step 2 of internal wiki to get write access so you do not need to get CI approvals in the future. If you think this is a mistake, please contact the Pytorch Dev Infra team.

linux-foundation-easycla bot commented May 15, 2025

CLA Not Signed

pytorch-bot bot commented May 15, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/153670

Note: Links to docs will display an error until the docs builds have been completed.

✅ You can merge normally! (1 Unrelated Failure)

As of commit 44086b6 with merge base 3aa8477:

UNSTABLE - The following job is marked as unstable, possibly due to flakiness on trunk:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D74672464

andyanwang added a commit to andyanwang/pytorch that referenced this pull request May 15, 2025
…-tree to pytorch in-tree (pytorch#153670)


Contributor

Attention! native_functions.yaml was changed

If you are adding a new function or defaulted argument to native_functions.yaml, you cannot use it from pre-existing Python frontend code until our FC window passes (two weeks). Split your PR into two PRs: one that adds the new C++ functionality, and one that makes use of it from Python; land them two weeks apart. See https://github.com/pytorch/pytorch/wiki/PyTorch's-Python-Frontend-Backward-and-Forward-Compatibility-Policy#forwards-compatibility-fc for more info.
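
As a hedged illustration of the split this policy asks for (`my_new_op` is a placeholder operator name, not something added or touched by this PR): the first PR adds only the C++/native_functions.yaml entry, and the Python frontend code that calls it lands after the two-week window, optionally guarded so binaries built before the change fail cleanly.

```python
import torch

# Hypothetical follow-up PR, landed after the two-week FC window.
# `my_new_op` is a placeholder name used only for illustration.
def call_new_op(x):
    # Guard so older binaries whose native_functions.yaml predates the
    # change (and which therefore never registered the op) fail cleanly.
    if hasattr(torch.ops.aten, "my_new_op"):
        return torch.ops.aten.my_new_op(x)
    raise RuntimeError("aten::my_new_op is not available in this build")
```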



pytorch-bot bot commented May 16, 2025

❌ 🤖 pytorchbot command failed:

Got EOF while in a quoted string
Try `@pytorchbot --help` for more info.

@andyanwang
Author

@pytorchbot label "topic: not user facing"

pytorch-bot bot added the topic: not user facing label May 16, 2025
