torch.finfo doesn't work for complex · Issue #35954 · pytorch/pytorch

Closed

anjali411 opened this issue Apr 3, 2020 · 15 comments
Labels
good first issue · module: complex (Related to complex number support in PyTorch) · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Comments

@anjali411
Contributor
anjali411 commented Apr 3, 2020

>>> import torch
>>> torch.finfo(torch.complex64)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: torch.finfo() requires a floating point input type. Use torch.iinfo to handle 'torch.complex64'
>>> import numpy
>>> numpy.finfo(numpy.complex64)
finfo(resolution=1e-06, min=-3.4028235e+38, max=3.4028235e+38, dtype=float32)

cc @ezyang @anjali411 @dylanbespalko

@anjali411 anjali411 added good first issue module: complex Related to complex number support in PyTorch labels Apr 3, 2020
@zou3519 zou3519 added the triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) label Apr 6, 2020
@Justin-Huber
Contributor

I'd like to work on this issue

@anjali411
Contributor Author

> I'd like to work on this issue

@Justin-Huber Thank you :) You can start by looking at this file: https://github.com/pytorch/pytorch/blob/master/torch/csrc/TypeInfo.cpp. Let me know if you have any questions or need any help, and add me as a reviewer when you have a working PR :D

@Justin-Huber
Contributor

Sounds good, thanks!

@jdklee
jdklee commented Apr 27, 2020

@Justin-Huber Did you take a crack at this? I was wondering if maybe we could both work on it?

@Justin-Huber
Contributor

@jdklee No I haven't, that sounds good to me! Let me know if you fork it and we can work on that together :)

@anjali411
Contributor Author

On further thought: even though NumPy supports finfo (floating-point info) for complex dtypes, I think we should not support torch.finfo for complex dtypes, since complex_tensor.is_floating_point() returns False. Perhaps we can add torch.cinfo for complex dtypes?

As an FYI, this is NumPy's current behavior:

>>> np.finfo(np.complex64)
finfo(resolution=1e-06, min=-3.4028235e+38, max=3.4028235e+38, dtype=float32)
>>> np.finfo(np.float32)
finfo(resolution=1e-06, min=-3.4028235e+38, max=3.4028235e+38, dtype=float32)

cc. @mruberry @ezyang

@mruberry
Collaborator

I agree that conceptually it would be more elegant for PyTorch (in a bubble) to have both finfo and cinfo, with finfo not supporting complex types. In practice, though, it seems fine to have finfo return values for complex dtypes, since the user is explicitly requesting them (making the risk of user confusion very low), and we greatly prefer being compatible with NumPy where possible. We could also add a cinfo with redundant functionality if we wanted.
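
The NumPy-compatible dispatch described here can be sketched as a simple delegation from complex dtypes to their real component types. The names below are hypothetical illustrations, not PyTorch's actual internals:

```python
# Hypothetical sketch: finfo on a complex dtype reports the limits of
# the matching real dtype; real dtypes pass through unchanged.
COMPLEX_TO_REAL = {
    "complex64": "float32",
    "complex128": "float64",
}

def finfo_target_dtype(dtype_name: str) -> str:
    """Return the dtype whose limits finfo should report (illustrative)."""
    return COMPLEX_TO_REAL.get(dtype_name, dtype_name)
```

A cinfo, if added, could delegate to the same table, which is why its functionality would be redundant.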

@Justin-Huber
Contributor

I'm finally getting around to this and have a few questions. Are there any tests/documentation for finfo? Is the expected behavior the same as NumPy's finfo, and are the only complex types np.complex_, np.complex128, and np.complex64?

@mruberry
Collaborator

@Justin-Huber There are existing tests for finfo in test/test_type_info.py, and it's documented here: https://pytorch.org/docs/master/type_info.html?highlight=finfo#torch.torch.finfo.

I would focus on complex64 and complex128 and yes, the expected behavior is the same as NumPy's. It'd be great to see a PR for this.

@Justin-Huber
Contributor

In https://github.com/pytorch/pytorch/blob/master/test/test_type_info.py#L40-L43, what's the reasoning behind constructing a tensor and then getting the dtype back from it? Is it to get the corresponding NumPy dtype? I was looking at #33152 and saw that complex tensors were disabled for the current release, so I was wondering if there's another build I should be working from, or if just skipping the tensor construction would be fine.

@mruberry
Collaborator

There's a lot of historic behavior in this test, and I think you're right: they're constructing a NumPy array as a way of accessing NumPy's finfo. In torch/testing/_internal/common_utils.py we actually have torch_to_numpy_dtype_dict, which will map torch dtypes to NumPy dtypes for you.
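
A minimal sketch of what such a mapping looks like, assuming only NumPy is installed. The dict below is an illustrative subset keyed on strings; the real torch_to_numpy_dtype_dict in torch/testing/_internal/common_utils.py is keyed on torch.dtype objects and covers every supported dtype.

```python
import numpy as np

# Illustrative subset of a torch -> NumPy dtype mapping. Strings stand
# in for the torch dtype objects so this sketch runs without torch.
torch_to_numpy_dtype = {
    "torch.float32": np.float32,
    "torch.float64": np.float64,
    "torch.complex64": np.complex64,
    "torch.complex128": np.complex128,
}

# A finfo test can then consult NumPy directly, with no tensor construction:
np_dtype = torch_to_numpy_dtype["torch.complex64"]
assert np.finfo(np_dtype).dtype == np.dtype(np.float32)
```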

You should work from the master branch, which lets you construct complex tensors.

@Justin-Huber
Contributor

Here's what I have so far: master...Justin-Huber:master

Should the default dtype support complex too?

@mruberry
Collaborator

Hey @Justin-Huber, hope you had a relaxing weekend.

No, the default dtype already supports too many types. We probably should have restricted it to float32 and float64.

You can open a draft PR with a [WIP] tag on it, too, by the way. That'll make it easy to track the conversation and look at the code together. Add me as a reviewer when you do.

@mrshenli mrshenli self-assigned this Sep 1, 2020
@mrshenli
Contributor
mrshenli commented Sep 1, 2020

It has been a while since the last update on this issue. I self-assigned it to grab it for bootcamp. Let me know if anyone already has a PR for this.

@mruberry
Copy link
Collaborator
mruberry commented Sep 1, 2020

I think we fixed this in #40488 and failed to close the issue.
