Signature should be extended for torch.hamming_window() #146590

Open
ILCSFNO opened this issue Feb 6, 2025 · 4 comments · May be fixed by #152682
Labels
module: python frontend (For issues relating to PyTorch's Python frontend)
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Comments

ILCSFNO commented Feb 6, 2025

🐛 Describe the bug

As seen in #145371, I noticed some similar situations in torch.hamming_window():

pytorch/torch/_torch_docs.py, lines 12380 to 12381 at 4a545eb:

hamming_window(window_length, periodic=True, alpha=0.54, beta=0.46, *, dtype=None, \
layout=torch.strided, device=None, requires_grad=False) -> Tensor

Minified Repro

import torch
import random
window_length = random.randint(1, 100)
periodic = True ## choice: True, False
alpha = random.uniform(0, 1)
beta = 1 - alpha
## The four calls below show keyword-argument combinations that each raise a TypeError
window = torch.hamming_window((window_length + 2), alpha=alpha, beta=beta) # (1) Error for: torch.hamming_window(window_length=int, alpha=float, beta=float)
# window = torch.hamming_window((window_length + 2), alpha=alpha) # (2) Error for: torch.hamming_window(window_length=int, alpha=float)
# window = torch.hamming_window((window_length + 2), beta=beta) # (3) Error for: torch.hamming_window(window_length=int, beta=float)
# window = torch.hamming_window((window_length + 2), periodic=periodic, beta=beta) # (4) Error for: torch.hamming_window(window_length=int, periodic=bool, beta=float)

Output

TypeError: hamming_window() received an invalid combination of arguments - got (int, beta=float, alpha=float), but expected one of:
 * (int window_length, *, torch.dtype dtype = None, torch.layout layout = None, torch.device device = None, bool pin_memory = False, bool requires_grad = False)
 * (int window_length, bool periodic, *, torch.dtype dtype = None, torch.layout layout = None, torch.device device = None, bool pin_memory = False, bool requires_grad = False)
 * (int window_length, bool periodic, float alpha, *, torch.dtype dtype = None, torch.layout layout = None, torch.device device = None, bool pin_memory = False, bool requires_grad = False)
 * (int window_length, bool periodic, float alpha, float beta, *, torch.dtype dtype = None, torch.layout layout = None, torch.device device = None, bool pin_memory = False, bool requires_grad = False)

These four combinations should arguably be accepted, yet each of them raises a TypeError.
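
Until the signature is extended, the same values can be passed positionally; a minimal workaround sketch (assuming PyTorch 2.5, matching the (int, bool, float, float) overload from the error message):

import torch

window_length = 10
alpha = 0.3
beta = 1 - alpha

# The keyword form torch.hamming_window(window_length, alpha=alpha, beta=beta)
# raises the TypeError above, but the equivalent positional call matches the
# periodic_alpha_beta overload and succeeds:
window = torch.hamming_window(window_length, True, alpha, beta)
print(window.shape)  # torch.Size([10])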

Suggestions

I suggest that leaving periodic, alpha, and beta unspecified behave identically to explicitly passing periodic=True, alpha=0.54, beta=0.46. That is, as a fix, the signature

(int window_length, *, torch.dtype dtype = None, torch.layout layout = None, torch.device device = None, bool pin_memory = False, bool requires_grad = False)

ought to be extended with periodic=True, alpha=0.54, and beta=0.46, and the other three signatures merged into it.
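
For illustration only, the merged signature could be approximated at the Python level by a hypothetical wrapper (not the actual fix, which would live in the binding/docs layer):

import torch

def hamming_window(window_length, periodic=True, alpha=0.54, beta=0.46,
                   *, dtype=None, layout=torch.strided, device=None,
                   pin_memory=False, requires_grad=False):
    # Give every parameter the keyword default the docs advertise, then
    # forward to the existing positional periodic_alpha_beta overload.
    return torch.hamming_window(window_length, periodic, alpha, beta,
                                dtype=dtype, layout=layout, device=device,
                                pin_memory=pin_memory,
                                requires_grad=requires_grad)

# All four previously failing keyword combinations now work:
hamming_window(12, alpha=0.5, beta=0.5)
hamming_window(12, alpha=0.5)
hamming_window(12, beta=0.4)
hamming_window(12, periodic=False, beta=0.4)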

Versions

pytorch==2.5.0
torchvision==0.20.0
torchaudio==2.5.0
pytorch-cuda=12.1

cc @albanD

@mikaylagawarecki added the module: python frontend label Feb 6, 2025
mikaylagawarecki commented Feb 6, 2025

Is the issue that hamming window actually does not have a default for periodic?

- func: hamming_window(int window_length, *, ScalarType? dtype=None, Layout? layout=None, Device? device=None, bool? pin_memory=None) -> Tensor
  dispatch:
    CompositeExplicitAutograd: hamming_window
  autogen: hamming_window.out
- func: hamming_window.periodic(int window_length, bool periodic, *, ScalarType? dtype=None, Layout? layout=None, Device? device=None, bool? pin_memory=None) -> Tensor
  dispatch:
    CompositeExplicitAutograd: hamming_window
  autogen: hamming_window.periodic_out
- func: hamming_window.periodic_alpha(int window_length, bool periodic, float alpha, *, ScalarType? dtype=None, Layout? layout=None, Device? device=None, bool? pin_memory=None) -> Tensor
  dispatch:
    CompositeExplicitAutograd: hamming_window
  autogen: hamming_window.periodic_alpha_out
- func: hamming_window.periodic_alpha_beta(int window_length, bool periodic, float alpha, float beta, *, ScalarType? dtype=None, Layout? layout=None, Device? device=None, bool? pin_memory=None) -> Tensor
  dispatch:
    CompositeExplicitAutograd: hamming_window
  autogen: hamming_window.periodic_alpha_beta_out

Seems actionable to fix the docs then
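
A quick sanity check (a minimal sketch, assuming PyTorch >= 2.0) confirms the defaults are applied inside the kernel rather than the Python binding, so the no-argument call agrees with the fully positional one:

import torch

# Defaults live in the kernel, not the Python binding: these two calls
# compute the same window.
a = torch.hamming_window(8)
b = torch.hamming_window(8, True, 0.54, 0.46)
assert torch.equal(a, b)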

@mikaylagawarecki added the triaged label Feb 6, 2025
ILCSFNO commented Feb 7, 2025

Yes, I think that may be the case.

ILCSFNO commented Feb 7, 2025

And the signatures listed there may look confusing as well.

ILCSFNO commented May 2, 2025
