8000 Allow to byteswap data when reading saved torch jit data by AlekseiNikiforovIBM · Pull Request #151447 · pytorch/pytorch · GitHub

Status: Open · wants to merge 4 commits into main
Conversation

AlekseiNikiforovIBM (Collaborator) commented Apr 16, 2025

It looks like some pickled data is endian-dependent. Byteswap such data when needed.

Add test cases.

Fixes #151428

cc @EikanWang @jgong5 @wenzhe-nrv @sanchitintel
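The underlying problem can be illustrated with a minimal sketch. This is not the PR's actual implementation; `struct` stands in for the real deserialization path, but it shows why bytes written on a little-endian x86 host must be swapped before a big-endian host (such as s390x) interprets them:

```python
# Minimal sketch of the endianness problem, assuming a float32 tensor value
# serialized on a little-endian writer and read back on a big-endian host.
import struct

value = 1.5
# Serialize as little-endian, as an x86 writer would.
raw = struct.pack("<f", value)

# A big-endian reader interpreting the same bytes without swapping
# recovers a different (garbage) value:
wrong = struct.unpack(">f", raw)[0]

# Swapping the byte order first recovers the original value:
right = struct.unpack(">f", raw[::-1])[0]

print(wrong == value)  # False
print(right == value)  # True
```

The same mismatch is what produces NaNs or nonsense values when a model saved on x86 is loaded on s390x without byteswapping.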

pytorch-bot bot commented Apr 16, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/151447

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 9e564e7 with merge base 604467d:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@pytorch-bot pytorch-bot bot added the release notes: jit release notes category label Apr 16, 2025
@facebook-github-bot facebook-github-bot added the oncall: jit Add this issue/PR to JIT oncall triage queue label Apr 16, 2025
malfet (Contributor) left a comment


Please do not embed large files into .py, just store them as artifacts (it looks like you are adding nearly 1500 lines to test_save_load.py)
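The reviewer's suggestion can be sketched as follows. This is a hedged illustration, not the PR's code: the `load_fixture` helper, the `fixtures` directory name, and the file names are all assumptions, showing the general pattern of keeping serialized test data as binary artifacts next to the test rather than embedding the bytes in `test_save_load.py`:

```python
# Hypothetical helper for loading binary test artifacts from disk instead of
# embedding ~1500 lines of serialized bytes in the test source file.
import os


def load_fixture(name, base_dir=None):
    # By default, resolve artifacts from a "fixtures" directory next to
    # this file (directory name is illustrative).
    if base_dir is None:
        base_dir = os.path.join(
            os.path.dirname(os.path.abspath(__file__)), "fixtures"
        )
    with open(os.path.join(base_dir, name), "rb") as f:
        return f.read()
```

A test would then do something like `data = load_fixture("saved_module_le.pt")` and feed the bytes to the loader under test, keeping the test source readable.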

AlekseiNikiforovIBM (Collaborator, Author)

Sure, I'll update tests.

AlekseiNikiforovIBM (Collaborator, Author)

Tests are updated and the build is fixed. Please take a look.

@albanD albanD added the triaged This issue has been looked at by a team member, and triaged and prioritized into an appropriate module label Apr 17, 2025
@albanD albanD requested a review from mikaylagawarecki April 17, 2025 13:31
@mikaylagawarecki mikaylagawarecki requested review from davidberard98 and removed request for mikaylagawarecki May 1, 2025 18:39
mikaylagawarecki (Contributor)

@davidberard98 tagging you for torch.jit (sorry, not sure whether you're still the right person here)

AlekseiNikiforovIBM (Collaborator, Author)

Could you please take a look at this PR again?

Labels
oncall: jit · open source · release notes: jit · triaged
Development

Successfully merging this pull request may close these issues.

TorchScript Model Saved on x86 Returns NaNs When Loaded on s390x
6 participants