[sparse][semi-structured] Fix RuntimeError when passing in non-contiguous input to SparseSemiStructured linear by jcaip · Pull Request #114593 · pytorch/pytorch · GitHub

Conversation

jcaip (Contributor) commented Nov 27, 2023

Summary:

This PR also brings in changes from #105595, which are needed for the changes in #110420

Currently, PyTorch incorrectly calculates the size of the returned matrix when we pass a non-contiguous batched (>2d) input to the semi-structured sparse subclass.

This is most common in MLP layers, where we have 2 linear layers back to back.

This will lead to an error like the following:

```
RuntimeError: shape '[20, 64, 64, 3072]' is invalid for input of size
62914560
```

The size of the sparse matmul result is off because the output shape is inferred from the wrong tensor shape.
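As a hedged aside (my own arithmetic, not stated in the PR): the element counts in the error message are consistent with the output shape having been inferred from a stale, untransposed weight shape. The specific hidden size of 768 below is an assumption deduced from the numbers, not taken from the PR.

```python
# Hypothetical arithmetic check on the numbers in the RuntimeError above.
# Assumption: the real last dimension was 768 (deduced; not in the PR text).
inferred_elems = 20 * 64 * 64 * 3072  # elements implied by '[20, 64, 64, 3072]'
actual_elems = 62_914_560             # input size reported by the RuntimeError

print(inferred_elems)                      # 251658240
print(inferred_elems // actual_elems)      # 4  (== 3072 // 768)
print(actual_elems == 20 * 64 * 64 * 768)  # True
```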

This happens because of a bug where we did not update the subclass tensor's shape when transposing it.
For semi-structured sparsity, transposing is a no-op in which we just flip a boolean flag, but we forgot to also update the tensor shape.
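A minimal sketch of the fix described above, using a plain Python stand-in rather than PyTorch's actual subclass (the class and attribute names here are hypothetical): when transpose only flips a metadata flag, the advertised shape must be swapped as well, or downstream shape inference reads stale dimensions.

```python
# Hypothetical stand-in for a 2-D semi-structured sparse tensor subclass.
# Transposing never touches the packed data; only metadata changes.
class FakeSemiStructuredTensor:
    def __init__(self, shape, transposed=False):
        self.shape = tuple(shape)
        self.transposed = transposed

    def t(self):
        # The bug: flipping `transposed` without updating `shape` left
        # stale dimensions behind for output-shape inference. The fix is
        # to swap the advertised shape alongside the flag.
        return FakeSemiStructuredTensor(
            shape=(self.shape[1], self.shape[0]),
            transposed=not self.transposed,
        )

w = FakeSemiStructuredTensor((3072, 768))
wt = w.t()
print(wt.shape, wt.transposed)  # (768, 3072) True
print(wt.t().shape)             # (3072, 768) -- double transpose restores it
```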

Note that this error goes away in inference mode, since we avoid decomposing the aten.linear op and handle shape folding ourselves, which changes the execution path.

An alternative fix is to set TORCH_FLATTEN_LINEAR_3D=True, which also avoids this error.

Test Plan:

```
python test/test_sparse_semi_structured.py -k test_mlp
```

Reviewers:

Subscribers:

Tasks:

Tags:
Pull Request resolved: #110420 Approved by: https://github.com/alexsamardzic, https://github.com/cpuhrsch

alexsamardzic and others added 2 commits November 27, 2023 05:49
@pytorch-bot pytorch-bot bot added the release notes: sparse label Nov 27, 2023
pytorch-bot bot commented Nov 27, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/114593

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 4091997 with merge base 138e289:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.


Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as Stale.
Feel free to remove the Stale label if you feel this was a mistake.
If you are unable to remove the Stale label please contact a maintainer in order to do so.
If you want the bot to never mark this PR stale again, add the no-stale label.
Stale pull requests will automatically be closed after 30 days of inactivity.

@github-actions github-actions bot added the Stale label Jan 26, 2024
@jcaip jcaip closed this Feb 12, 2024
@github-actions github-actions bot deleted the jcaip/semi-structured-sparse-shape-mismatch-bugfix-2.1.2 branch March 14, 2024 01:50
Labels
release notes: sparse, Stale