OpInfo: `fmod` and `remainder` by krshrimali · Pull Request #57941 · pytorch/pytorch · GitHub

OpInfo: fmod and remainder #57941


Closed
Changes from 1 commit
60 commits
4d1a080
Adding OpInfo for fmod, still some failures
krshrimali May 10, 2021
b40e11f
fmod: skipping variant_eager and jit tests for now, needs debugging
krshrimali May 10, 2021
2daee8f
Merging with upstream
krshrimali May 10, 2021
2a8772f
No extra lines please
krshrimali May 10, 2021
e8637f1
Tests for remainder, check variant tests (TODO)
krshrimali May 10, 2021
d804adc
Reorganize sample inputs, add comments
krshrimali May 11, 2021
9604f48
Scalar tensors
krshrimali May 11, 2021
c22d387
Add include_zero param to make_tensor, merge sample_inputs func
krshrimali May 12, 2021
0bb2482
Add include_zero in make_tensor documentation
krshrimali May 12, 2021
7cf0861
Changing the docs, WIP
krshrimali May 12, 2021
a001581
Merge remote-tracking branch 'upstream/master' into origin/fmod-remai…
krshrimali May 12, 2021
abdfe1f
Merge remote-tracking branch 'upstream/master' into origin/fmod-remai…
krshrimali May 12, 2021
c445b0b
Bring back removed lines of code, minor fixes
krshrimali May 12, 2021
bd446ab
Trailing whitespace removed
krshrimali May 12, 2021
4e34135
use torch.finfo, add support for complex tensors
krshrimali May 12, 2021
6cf48ac
Merge
krshrimali May 12, 2021
f95cba6
Remove fmod and remainder from method_tests
krshrimali May 12, 2021
a5bccee
Docs update for fmod and remainder, add formula + comparison
krshrimali May 13, 2021
726b1eb
Merge branch 'master' into origin/fmod-remainder-opinfo
krshrimali May 13, 2021
8703bfa
bring torch.rand back
krshrimali May 13, 2021
5d0d9ce
Docs fix for tquot and rquot
krshrimali May 13, 2021
b4c38e4
use tuples instead of lists
krshrimali May 13, 2021
0174dd2
Merge conflict resolved
krshrimali May 13, 2021
70385c9
Fixing docs errors
krshrimali May 13, 2021
42d49d2
Flake8 errors fixes
krshrimali May 13, 2021
e074075
Add complex inputs support, exclude_values arg, skip jit variant test…
krshrimali May 13, 2021
b25de8c
Minor fix in if condition for exclude_values
krshrimali May 13, 2021
125ec77
nit
krshrimali May 13, 2021
5856422
Minor change in the line for complex inputs
krshrimali May 13, 2021
a26f041
Don't use mutable datastructure as default arg
krshrimali May 13, 2021
7684fc6
Merge
krshrimali May 13, 2021
c40c6fc
Attempt to solve mypy error
krshrimali May 13, 2021
92899ba
typo fixed, as per review from Kshitij12345
krshrimali May 13, 2021
17480af
Split OpInfos for autodiffed samples, addressed review
krshrimali May 13, 2021
572082d
Remove add_other, add tensor scalar input, add torch.bool as dtype fo…
krshrimali May 13, 2021
1c26704
Address reviews on doc
krshrimali May 17, 2021
6797f4e
autodiffed from autodiffed_args
krshrimali May 17, 2021
6c54bdf
Minor changes in the doc
krshrimali May 17, 2021
020424b
Merge remote-tracking branch 'upstream/master' into origin/fmod-remai…
krshrimali May 17, 2021
6db6536
Remove THC impl for fmod, unused
krshrimali May 18, 2021
cff930a
bring it back
krshrimali May 18, 2021
70506ff
Minor fix in the docs
krshrimali May 18, 2021
2e59a1b
Split doc, as suggested
krshrimali May 18, 2021
2825087
Minor fix, thanks Kshiteej
krshrimali May 19, 2021
b6ab98c
Fix mypy error, source: mypy docs
krshrimali May 19, 2021
ff00042
mypy: ignore assignment added
krshrimali May 19, 2021
6cb309f
Remove THC definition, unused
krshrimali May 19, 2021
f6bb0b4
Docs update
krshrimali May 21, 2021
174ad54
Bring back exclude_zero, as per review
krshrimali May 21, 2021
8076a57
Merge remote-tracking branch 'upstream/master' into origin/fmod-remai…
krshrimali May 21, 2021
0832347
Merge branch 'origin/fmod-remainder-opinfo' of https://github.com/kr…
krshrimali May 21, 2021
e170a35
Use exclude_zero, minor typo fix
krshrimali May 21, 2021
eea02fc
Merging conflicts
krshrimali May 24, 2021
80c8dab
Doc update and add fmod, remainder to works_list
krshrimali May 26, 2021
2c3837d
Merge remote-tracking branch 'upstream/master' into origin/fmod-remai…
krshrimali May 26, 2021
1dea0e8
Add fmod and remainder to works_list
krshrimali May 26, 2021
3ac46f7
Complete merge
krshrimali May 26, 2021
2a99968
Testing
krshrimali May 26, 2021
0f095c9
Add remainder.autodiffed and fmod.autodiffed to the list, rollback er…
krshrimali May 26, 2021
c3273f7
minor typo fixed
krshrimali May 26, 2021
Bring back removed lines of code, minor fixes
krshrimali committed May 12, 2021
commit c445b0ba1e3ca25d86b3d9a9562d56f3b3358a64
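For context on what this commit's sample inputs exercise: torch.fmod follows C's fmod convention (the result takes the sign of the dividend), while torch.remainder follows Python's modulo convention (the result takes the sign of the divisor). A minimal standalone illustration, with values chosen here for demonstration rather than taken from the test suite:

import torch

a = torch.tensor([-3.5, 3.5])   # dividends
b = torch.tensor([2.0, -2.0])   # divisors

# fmod: each result keeps the sign of the dividend (C fmod semantics)
print(torch.fmod(a, b))        # tensor([-1.5000,  1.5000])

# remainder: each result keeps the sign of the divisor (Python % semantics)
print(torch.remainder(a, b))   # tensor([ 0.5000, -0.5000])

This sign difference is also what the later doc commits in this PR compare via an explicit formula.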
132 changes: 86 additions & 46 deletions torch/testing/_internal/common_methods_invocations.py
@@ -2528,38 +2528,6 @@ def sample_inputs_pow(op_info, device, dtype, requires_grad, **kwargs):
requires_grad=requires_grad),)))
return tuple(samples)

def sample_inputs_fmod_remainder(op_info, device, dtype, requires_grad, **kwargs):
make_arg = partial(make_tensor, dtype=dtype, device=device, requires_grad=requires_grad)

cases = [
((S, S, S), (), 1.5, False), # Scalar
((), 1.5, (), False), # Scalar
((S, S, S), (S, S, S), 1.5, False),
]

# Sample inputs with broadcasting
cases_with_broadcasting = [
((S,), (S, S, S), 1.5, True),
((S, S, S), (S,), 1.5, True),
((S, 1, S), (S, S), 1.5, True),
((), (S, S, S), 1.5, True),
((S, S, S), 1.5, (), True) # Scalar
]

cases.extend(cases_with_broadcasting)

def generator():
for shape, shape_other, add_other, broadcasts_input in cases:
if isinstance(shape_other, tuple):
arg = make_arg(shape_other, requires_grad=False, include_zero=False) + add_other
else:
# shape_other is scalar
arg = shape_other
yield(SampleInput(make_arg(shape), args=(arg,),
broadcasts_input=broadcasts_input))

return list(generator())

def sample_inputs_svd(op_info, device, dtype, requires_grad=False, **kwargs):
return _sample_inputs_svd(op_info, device, dtype, requires_grad, is_linalg_svd=False)

@@ -2692,6 +2660,34 @@ def sample_inputs_fliplr_flipud(op_info, device, dtype, requires_grad, **kwargs)
)
return [SampleInput(tensor) for tensor in tensors]

def sample_inputs_fmod_remainder(op_info, device, dtype, requires_grad, **kwargs):
make_arg = partial(make_tensor, dtype=dtype, device=device, requires_grad=requires_grad)

cases = [((S, S, S), (), 1.5, False),
((), 1.5, (), False), # Scalar
((S, S, S), (S, S, S), 1.5, False)]

# Sample inputs with broadcasting
cases_with_broadcasting = [((S,), (S, S, S), 1.5, True),
((S, S, S), (S,), 1.5, True),
((S, 1, S), (S, S), 1.5, True),
((), (S, S, S), 1.5, True),
((S, S, S), 1.5, (), True)]

cases.extend(cases_with_broadcasting)

def generator():
for shape, shape_other, add_other, broadcasts_input in cases:
if isinstance(shape_other, tuple):
arg = make_arg(shape_other, requires_grad=False, include_zero=False) + add_other
else:
# shape_other is scalar
arg = shape_other
yield(SampleInput(make_arg(shape), args=(arg,),
broadcasts_input=broadcasts_input))

return list(generator())

# TODO: clamp shares tensors among its sample inputs --- we should prohibit this!
def sample_inputs_clamp(op_info, device, dtype, requires_grad, **kwargs):
x = make_tensor((S, M, S), device, dtype, low=None, high=None, requires_grad=requires_grad)
@@ -6268,6 +6264,22 @@ def method_tests():
('reshape_as', (S, S, S), (non_differentiable(torch.rand(S * S, S)),)),
('reshape_as', (), (non_differentiable(torch.tensor(42.)),), 'scalar'),
('reshape_as', (), (non_differentiable(torch.rand(1, 1)),), 'scalar_to_dims'),
('fmod', (S, S, S), (1.5,), '', (True,)),
('fmod', (), (1.5,), 'scalar', (True,)),
('fmod', (S, S, S), (non_differentiable(torch.rand(S, S, S) + 1.5),), 'tensor'),
('fmod', (S,), (non_differentiable(torch.rand(S, S, S) + 1.5),), 'tensor_broadcast_lhs'),
('fmod', (S, S, S), (non_differentiable(torch.rand(S) + 1.5),), 'tensor_broadcast_rhs'),
('fmod', (S, 1, S), (non_differentiable(torch.rand(S, S) + 1.5),), 'tensor_broadcast_all'),
('fmod', (), (non_differentiable(uniform_scalar(1.5)),), 'scalar_tensor'),
('fmod', (), (non_differentiable(torch.rand(S, S, S) + 1.5),), 'scalar_tensor_broadcast_lhs'),
('fmod', (S, S, S), (non_differentiable(uniform_scalar(1.5)),), 'scalar_tensor_broadcast_rhs'),
('remainder', (S, S, S), (1.5,), '', (True,)),
('remainder', (), (1.5,), 'scalar', (True,)),
('remainder', (S, S, S), (non_differentiable(torch.rand(S, S, S) + 1.5),), 'tensor'),
('remainder', (S,), (non_differentiable(torch.rand(S, S, S) + 1.5),), 'tensor_broadcast_lhs'),
('remainder', (S, 1, S), (non_differentiable(torch.rand(S, S) + 1.5),), 'tensor_broadcast_all'),
('remainder', (), (non_differentiable(uniform_scalar(1.5)),), 'scalar_tensor'),
('remainder', (), (non_differentiable(torch.rand(S, S, S) + 1.5),), 'scalar_tensor_broadcast_lhs'),
('kthvalue', (S, S, S), (2,)),
('kthvalue', (S, S, S), (2, 1,), 'dim', (), [1]),
('kthvalue', (S, S, S), (2, 1, True,), 'keepdim_dim', (), [1]),
@@ -6606,25 +6618,53 @@ def unpack_variables(args):
return args


EXCLUDE_FUNCTIONAL = {
'addmm',
'addmm_',
'reshape',
'where' # argument order
}
EXCLUDE_GRADCHECK: Dict[str, Any] = {
}
EXCLUDE_GRADGRADCHECK: Dict[str, Any] = {
EXCLUDE_FUNCTIONAL = {
'addmm',
'addmm_',
'reshape',
'where' # argument order
}
EXCLUDE_GRADCHECK: Dict[str, Any] = {
}
EXCLUDE_GRADGRADCHECK: Dict[str, Any] = {
}
EXCLUDE_GRADGRADCHECK_BY_TEST_NAME = {
# `other` expand_as(self, other) is not used in autograd.
'test_expand_as',
'test_cdist',
}


def exclude_tensor_method(name, test_name):
# there are no tensor equivalents for these (inplace or out)
exclude_all_tensor_method_by_test_name = {
'test_slice',
'test_where',
'test_where_broadcast_all',
'test_where_scalar',
'test_where_scalar_broadcast_mask',
'test_where_scalar_broadcast_non_mask',
'test_var_mean_keepdim_dim_1d',
'test_var_mean_keepdim_dim',
'test_var_mean_dim_1d',
'test_var_mean_dim',
'test_var_mean',
'test_std_mean_keepdim_dim_1d',
'test_std_mean_keepdim_dim',
'test_std_mean_dim_1d',
'test_std_mean_dim',
'test_std_mean',
}
EXCLUDE_GRADGRADCHECK_BY_TEST_NAME = {
# `other` expand_as(self, other) is not used in autograd.
'test_expand_as',
'test_cdist',
# there are no out-of-place tensor equivalents for these
exclude_outplace_tensor_method = {
'index_fill',
'scatter',
'scatter_add',
}
if test_name in exclude_all_tensor_method_by_test_name:
return True
is_magic_method = name[:2] == '__' and name[-2:] == '__'
is_inplace = name[-1] == "_" and not is_magic_method
if not is_inplace and name in exclude_outplace_tensor_method:
return True
return False
return False
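As a quick sanity check of the broadcasting cases restored in this commit, the sketch below mirrors one (shape, shape_other) pair from sample_inputs_fmod_remainder and the +1.5 shift that keeps the divisor away from zero. S = 5 is only an illustrative size, and the snippet is a standalone example rather than part of the test suite:

import torch

S = 5
x = torch.rand(S, 1, S, requires_grad=True)  # differentiable input that broadcasts against the divisor
y = torch.rand(S, S) + 1.5                   # non-differentiable divisor, shifted away from zero

out = torch.remainder(x, y)
print(out.shape)       # torch.Size([5, 5, 5])
out.sum().backward()   # gradients flow only to x
print(x.grad.shape)    # torch.Size([5, 1, 5])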