MPS: Datatype precedence in binary ops [WIP] by lhoenig · Pull Request #78319 · pytorch/pytorch · GitHub
Closed · wants to merge 23 commits
Changes from 1 commit
update comments with new information
lhoenig committed May 28, 2022
commit f86d924aaa38df7159dcb08153d160e636961d83
3 changes: 2 additions & 1 deletion test/test_mps.py
@@ -1309,7 +1309,7 @@ def test_binops_dtype_precedence(self):
torch.int16: [-15, 0, 1, 10],
torch.int32: [-376, 0, 1, 13],
torch.int64: [-8, 0, 1, 77],
-            # torch.float16: [-234.5, 0.0, 1.0, 2.0], # TODO: Broken, currently unknown why
+            # torch.float16: [-234.5, 0.0, 1.0, 2.0], # TODO: Only add/sub work correctly, currently unknown why
torch.float32: [-1.0, 0.0, 0.1, 111.99],
}
# Test all combinations of dtypes, operations, dimensionality
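The "all combinations" comment above describes the overall shape of the test. As a rough sketch only (not the PR's actual code; the op list, the trimmed dtype set, and the CPU-reference comparison are assumptions made for illustration), the combinatorial loop could look like this:

```python
import itertools
import operator

import torch

# Rough sketch, not the PR's code: iterate over dtype pairs and binary ops and
# check that the MPS result dtype matches the CPU result dtype (type promotion).
sample_values = {
    torch.int16: [-15, 0, 1, 10],
    torch.int32: [-376, 0, 1, 13],
    torch.float32: [-1.0, 0.0, 0.1, 111.99],
}
ops = [operator.add, operator.sub, operator.mul]  # assumed subset of the ops under test

if torch.backends.mps.is_available():
    for (dt_a, vals_a), (dt_b, vals_b) in itertools.product(sample_values.items(), repeat=2):
        for op in ops:
            cpu_a = torch.tensor(vals_a, dtype=dt_a)
            cpu_b = torch.tensor(vals_b, dtype=dt_b)
            expected = op(cpu_a, cpu_b)
            actual = op(cpu_a.to("mps"), cpu_b.to("mps")).cpu()
            # The result dtype (and ideally the values) should agree across backends.
            assert actual.dtype == expected.dtype, (dt_a, dt_b, op)
```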
@@ -1353,6 +1353,7 @@ def test_binops_dtype_precedence(self):
# Multiple problems with [MPSGraph constantWithScalar:shape:dataType:] prevent
# these tests from completing successfully currently
# TODO: Research problem with int16, is it also related to constantWithScalar?
+        # - Likely, getting result tensor with 5/10 correct entries
# TODO: Stateful bug with False, False, add in assert5? Related to the cache key
# or more serious problem?
# - Cache key looks correct, behavior currently completely unexplained
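The float16 TODO above notes that only add/sub behave correctly. A minimal standalone probe along these lines (my own sketch, not part of the PR) would show which ops disagree with the CPU reference:

```python
import operator

import torch

# Minimal sketch (not from the PR): probe float16-vs-float32 binary ops on MPS
# and report which ones disagree with the CPU reference.
if torch.backends.mps.is_available():
    a16 = torch.tensor([-234.5, 0.0, 1.0, 2.0], dtype=torch.float16)
    b32 = torch.tensor([-1.0, 0.0, 0.1, 111.99], dtype=torch.float32)
    for name, op in [("add", operator.add), ("sub", operator.sub),
                     ("mul", operator.mul), ("div", operator.truediv)]:
        expected = op(a16, b32)  # CPU reference; promotes to float32
        actual = op(a16.to("mps"), b32.to("mps")).cpu()
        ok = actual.dtype == expected.dtype and torch.allclose(actual, expected, equal_nan=True)
        print(f"{name}: {'OK' if ok else 'MISMATCH'} (mps dtype {actual.dtype})")
```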