fix 142457 , fixes double free corruption by adding TORCH_CHECK to ensure weights have the proper size · Pull Request #148620 · pytorch/pytorch

Open
wants to merge 2 commits into base: main

Conversation

@AmalDevHaridevan commented Mar 5, 2025

Fixes #142457

Problem

slow_conv_transpose3d_shape_check currently does not enforce the constraint that weight has shape n_input_plane x n_output_plane x kernel_depth x kernel_height x kernel_width. This causes the undefined behavior seen in issue #142457. The fix is to enforce that the sizes of weight at dims 3, 4, and 5 equal the kernel sizes at dims 1, 2, and 3. In the reproduction below, for example, weight has shape (2, 3, 2, 3, 2) while kernel_size is [1, 1, 1], so the trailing weight dims (2, 3, 2) do not match the kernel sizes; before this change the mismatch went undetected and led to the memory corruption shown below.

Fix

Added three TORCH_CHECKs to enforce the above constraint; a sketch of what they might look like is shown below.
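For reference, here is a minimal sketch of the added constraint. The helper name, parameter names, and includes are illustrative assumptions; in the PR itself the checks are added inside slow_conv_transpose3d_shape_check and the exact wording may differ.

// Hypothetical sketch: reject a weight whose trailing dims do not match the kernel sizes.
// Dimension numbers in the messages are 1-based to match the reported error text,
// while Tensor::size() uses 0-based indexing.
#include <ATen/ATen.h>
#include <c10/util/Exception.h>

static void check_weight_matches_kernel(
    const at::Tensor& weight,
    int64_t kernel_depth,
    int64_t kernel_height,
    int64_t kernel_width) {
  // weight is expected to be 5-D:
  // n_input_plane x n_output_plane x kernel_depth x kernel_height x kernel_width
  TORCH_CHECK(
      weight.size(2) == kernel_depth,
      "Expected weight to have size ", kernel_depth,
      " at dimension 3 but got ", weight.size(2));
  TORCH_CHECK(
      weight.size(3) == kernel_height,
      "Expected weight to have size ", kernel_height,
      " at dimension 4 but got ", weight.size(3));
  TORCH_CHECK(
      weight.size(4) == kernel_width,
      "Expected weight to have size ", kernel_width,
      " at dimension 5 but got ", weight.size(4));
}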

Test

Reproduction code

import torch

self = torch.full((1, 2, 4, 5, 4,), 0.5, dtype=torch.double)
weight = torch.full((2, 3, 2, 3, 2,), 0.5, dtype=torch.double)
kernel_size = [1, 1, 1]
bias = torch.full((3,), 0.5, dtype=torch.double)
stride = [1, 1, 1]
padding = [2, 2, 2]
output_padding = [2, 2, 2]
dilation = [1879048192, 1879048192, 1879048192]
torch.ops.aten.slow_conv_transpose3d(self, weight, kernel_size, bias, stride, padding, output_padding, dilation)

Before fix

double free or corruption (!prev)
Aborted (core dumped)

After fix

Traceback (most recent call last):
  File "/home/system/Desktop/pytorch_contrib/pytorch/../test.py", line 11, in <module>
    torch.ops.aten.slow_conv_transpose3d(self, weight, kernel_size, bias, stride, padding, output_padding, dilation)
  File "/home/system/Desktop/pytorch_contrib/pytorch/torch/_ops.py", line 1158, in __call__
    return self._op(*args, **(kwargs or {}))
RuntimeError: Expected weight to have size 1 at dimension 3 but got 2

More verification tests

import torch

self = torch.full((1, 2, 4, 5, 4,), 0.5, dtype=torch.double)
weight = torch.full((2, 3, 2, 3, 2,), 0.5, dtype=torch.double)
kernel_size = [2, 3, 2,]
bias = torch.full((3,), 0.5, dtype=torch.double)
stride = [1, 1, 1]
padding = [2, 2, 2]
output_padding = [0, 0, 0]
dilation = [1, 1, 1]
res1  = torch.ops.aten.slow_conv_transpose3d(self, weight, kernel_size, bias, stride, padding, output_padding, dilation)
module = torch.nn.ConvTranspose3d(2, 3, kernel_size=kernel_size, stride=stride, padding=padding, output_padding=output_padding, dilation=dilation, bias=True)
module.weight = torch.nn.Parameter(weight)
module.bias = torch.nn.Parameter(bias)
res2 = module(self)
assert torch.allclose(res1, res2)
print("Success")

pytorch-bot bot commented Mar 5, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/148620

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit b3b46b5 with merge base d6d670a:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@AmalDevHaridevan changed the title fix 142457 , fixes double free corruption by adding TORCH_CHECK to esure weights have the proper size → fix 142457 , fixes double free corruption by adding TORCH_CHECK to ensure weights have the proper size Mar 5, 2025
@bdhirsh requested a review from mikaylagawarecki March 6, 2025 01:53
@bdhirsh added the triaged label Mar 6, 2025
@AmalDevHaridevan (Author)

@pytorchbot label "topic: not user facing"

@pytorch-bot bot added the topic: not user facing label Mar 6, 2025
@AmalDevHaridevan (Author)

@mikaylagawarecki Any updates on the PR 😁😃?

Contributor

Looks like this PR hasn't been updated in a while so we're going to go ahead and mark this as Stale.
Feel free to remove the Stale label if you feel this was a mistake.
If you are unable to remove the Stale label please contact a maintainer in order to do so.
If you want the bot to never mark this PR stale again, add the no-stale label.
Stale pull requests will automatically be closed after 30 days of inactivity.

@github-actions bot added the Stale label May 15, 2025
Labels
open source · Stale · topic: not user facing · triaged
Projects
None yet
Development

Successfully merging this pull request may close these issues.

Aborted (core dumped) in slow_conv_transpose3d
3 participants