torch.nonzero(t, as_tuple=...) does not work with the JIT because the as_tuple signatures are not exposed properly #45499
Comments
Currently it's hard to support this op because the return type depends on the value of as_tuple.
Or maybe we should remove the kwarg and deprecate functions that return different types of objects based on their input values.
That also sounds good to me! I don't know whether nonzero was on the path to deprecation, or what the eager-mode concerns around deprecating it outside of the JIT would be. I know that NumPy has quite a large number of functions whose output type depends on the input type. The mechanism to return a different type depending on the input type is implemented for Python functions, just not for builtins: Line 782 in bb19a55
@eellison what did you do for …?
Same issue as #38718
@ngimel it uses the dispatching method we have implemented in …
Just ran into this as well. Quite annoying. Looking forward to a fix.
I can give guidance on how this might be fixed, but I don't know if I have the bandwidth to fix it myself. Adding back to triage.
Could you write up a little about how to fix it in detail? Thanks!
To do this "the right way" I would: add Literal types.
But this might be more trouble than it's worth. We might instead consider throwing an error, exposing the tuple/non-tuple signatures, and asking the user to directly invoke the unique_tuple vs unique overloads (those may or may not exist right now, I'm not sure).
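The Literal-type idea above can be sketched in plain Python typing. This is a hypothetical stand-in, not torch's real signatures: `Tensor` here is just a nested-list alias, and the `nonzero` below is a toy reimplementation for illustration.

```python
from typing import List, Literal, Tuple, overload

# Hypothetical sketch of the "add Literal types" proposal: with Literal[True]
# and Literal[False] overloads, a checker (or the JIT) could give each call
# site a single static return type. `Tensor` is a stand-in, not torch's class.
Tensor = List[List[int]]

@overload
def nonzero(t: Tensor, as_tuple: Literal[False] = ...) -> List[List[int]]: ...
@overload
def nonzero(t: Tensor, as_tuple: Literal[True]) -> Tuple[List[int], ...]: ...

def nonzero(t, as_tuple=False):
    # coordinate rows: one [i, j] entry per nonzero element
    coords = [[i, j] for i, row in enumerate(t) for j, v in enumerate(row) if v]
    if as_tuple:
        # one index sequence per dimension, like the as_tuple=True form
        return tuple(list(c) for c in zip(*coords)) if coords else ((), ())
    return coords
```

With such overloads, a static checker resolves `nonzero(x, as_tuple=True)` to the tuple overload at the call site, which is exactly the property the JIT would need.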
Where by …
It does seem worth doing; this has already come up for …
Any updates? It has been a year since #38718 was opened, and it was once part of the 1.6.0 milestone (#38718 (comment)).
Thanks @eellison for proposing a potential solution. Indeed it looks like more trouble (introducing a Literal type) than it is worth. Besides, since the output tuple type depends on the rank of the input tensor, it doesn't seem we can handle the … What if we model the output as …
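The rank problem mentioned above can be illustrated with a plain-Python sketch (lists standing in for tensors; not torch code): the as_tuple-style result has one index sequence per dimension, so its length varies with the input's rank and no single fixed-length tuple type covers every case.

```python
# Plain-Python illustration (not torch's API): the as_tuple-style result has
# one index sequence per dimension, so its length varies with input rank.
def nonzero_tuple_1d(t):
    # rank 1 -> a 1-element tuple
    return ([i for i, v in enumerate(t) if v],)

def nonzero_tuple_2d(t):
    # rank 2 -> a 2-element tuple
    coords = [(i, j) for i, row in enumerate(t) for j, v in enumerate(row) if v]
    return tuple(list(c) for c in zip(*coords)) if coords else ([], [])
```

A variadic type like `Tuple[Tensor, ...]` would describe this, but as far as I know TorchScript tuples are fixed-length, which is the obstacle this comment is raising.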
Note that the Python Array API specifies that nonzero returns a tuple by default (see https://data-apis.org/array-api/latest/API_specification/searching_functions.html#nonzero-x), although we may be able to argue it should return a tensor. @ngimel, @rgommers
From data-apis/array-api#23: the reason apparently was that it's easier for advanced indexing (you can then simply do …).
Without a tuple, you can do …
True, …
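The indexing argument from this exchange can be made concrete with NumPy (used here because it is the library the thread compares against):

```python
import numpy as np

# NumPy's nonzero returns a tuple of per-dimension index arrays, which plugs
# directly into advanced indexing: x[np.nonzero(x)] selects the nonzero values.
x = np.array([[0, 3], [4, 0]])
idx = np.nonzero(x)                 # (array([0, 1]), array([1, 0]))
by_tuple = x[idx]                   # array([3, 4])

# The coordinate-matrix form (PyTorch's default nonzero layout) needs a
# transpose-and-convert step before it can index the same way:
coords = np.argwhere(x)             # [[0, 1], [1, 0]], one row per element
by_matrix = x[tuple(coords.T)]      # array([3, 4])
```

So the tuple form saves one conversion step when the result is fed straight back into indexing, which seems to be the rationale discussed in data-apis/array-api#23.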
I investigated the state some more; a bit of history in NumPy on …

There were a number of issues related to …

In practice, there's not much of a difference one way or the other; it's just that PyTorch made a different choice for the return type early on than other libraries did. It's not going to change in the array API or in NumPy. So for that aspect, I think the choice is between PyTorch remaining different, or making a hard BC-breaking change. There's something to be said for the former here.

EDIT: the discussion on data-apis/array-api#23 continued after the above; there's also …
This is because of the Python arg parsing done to handle the "as_tuple" kwarg. Note that tracing works fine.
cc @mruberry @rgommers @heitorschueroff @pmeier @gmagogsfm