MAINT: Add aliases for commonly used `ArrayLike` objects by BvB93 · Pull Request #18050 · numpy/numpy


Merged
1 commit merged into numpy:master on Dec 22, 2020

Conversation

@BvB93 (Member) commented Dec 21, 2020

This PR adds a number of `_ArrayLike<x>` aliases, each representing all array-like objects that can be coerced into `x` (assuming `casting="same_kind"`).

The aliases introduced here will be necessary for handling dtype-specific overloads of array-likes.

Examples

A simplified mock-up example of np.exp (pretending for a second that it's an ordinary Python function):

from typing import overload, Any
import numpy as np
import numpy.typing as npt

_FloatArray = np.ndarray[Any, np.dtype[np.floating[Any]]]
_ComplexArray = np.ndarray[Any, np.dtype[np.complexfloating[Any, Any]]]
_ObjectArray = np.ndarray[Any, np.dtype[np.object_]]

@overload
def exp(a: npt._ArrayLikeFloat) -> _FloatArray: ...
@overload
def exp(a: npt._ArrayLikeComplex) -> _ComplexArray: ...
@overload
def exp(a: npt._ArrayLikeObject) -> _ObjectArray: ...
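
For context, here is a hedged sketch of how such a dtype-specific alias might be composed; the names and the exact scalar unions below are purely illustrative and are not the definitions added by this PR:

from typing import Any, Sequence, Union

import numpy as np

# Illustrative only: scalar types that cast to floating under
# `casting="same_kind"`, plus array-likes built from them.
_FloatScalarSketch = Union[bool, int, float, np.bool_, np.integer, np.floating]
_ArrayLikeFloatSketch = Union[
    _FloatScalarSketch,
    np.ndarray[Any, np.dtype[np.floating[Any]]],
    Sequence[_FloatScalarSketch],
    Sequence[Sequence[_FloatScalarSketch]],  # deeper nesting omitted for brevity
]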

@BvB93 (Member, Author) commented on the diff, Dec 21, 2020:

# A union representing array-like objects; consists of two typevars:
# One representing types that can be parametrized w.r.t. `np.dtype`
# and another one for the rest
_ArrayLike = Union[

Ideally we'd only need a single typevar here so that we can just use, e.g., _ArrayLike[np.float64].

Unfortunately this would require us to somehow express a np.generic via its builtin counterpart (e.g. expressing float as SuperFancyProtocol[np.float64]), which is simply not possible at the moment. Not without some plugin magic, at least.

@charris (Member) commented Dec 22, 2020

How can it be determined that some array_like is, say, complex? I am probably not understanding how these will be used.

@BvB93 (Member, Author) commented Dec 22, 2020

How can it be determined that some array_like is, say, complex? I am probably not understanding how these will be used.

Right, so there is the pre-existing npt.ArrayLike alias that we currently use for representing arbitrary array-like objects (nested sequences of scalars and/or __array__-supporting objects). The new aliases introduced here represent various subsets of npt.ArrayLike, e.g. the portion that can be coerced into complex.
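
As a hedged illustration, these are the results one would expect mypy to report for the mocked-up exp overloads shown above (not output taken from the PR):

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # A nested list of floats falls in the float-coercible subset,
    # so the float overload of the mocked-up `exp` is selected.
    reveal_type(exp([[1.0, 2.0], [3.0, 4.0]]))
    # Expected: ndarray[Any, dtype[floating[Any]]]

    # Mixing in bools and floats is fine for the complex subset, since all
    # of them cast to complex under `casting="same_kind"`.
    reveal_type(exp([1 + 2j, 3.0, True]))
    # Expected: ndarray[Any, dtype[complexfloating[Any, Any]]]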

@charris (Member) commented Dec 22, 2020

e.g. the portion that can be coerced into complex.

How is it determined if something can be coerced to complex? Is that something declared by a user?

@BvB93 (Member, Author) commented Dec 22, 2020

How is it determined if something can be coerced to complex? Is that something declared by a user?

This is based on the same_kind casting rule, so in the case of complex that would be booleans, integers, floats and complex numbers.
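
For reference, the same rule can be checked at runtime with np.can_cast; the static aliases simply mirror these results (a small sketch):

import numpy as np

# The `same_kind` casting rule delimits the complex-coercible subset:
# booleans, integers and floats may all be cast to complex.
print(np.can_cast(np.bool_,      np.complex128, casting="same_kind"))  # True
print(np.can_cast(np.int64,      np.complex128, casting="same_kind"))  # True
print(np.can_cast(np.float64,    np.complex128, casting="same_kind"))  # True
print(np.can_cast(np.complex128, np.complex128, casting="same_kind"))  # True

# Strings are excluded, and the reverse direction (complex -> float) is not
# allowed either without `casting="unsafe"`.
print(np.can_cast(np.str_,       np.complex128, casting="same_kind"))  # False
print(np.can_cast(np.complex128, np.float64,    casting="same_kind"))  # False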

@charris (Member) commented Dec 22, 2020

That works for scalars, but how does it work for a deeply nested list without examining every element? I guess my question is, how can that be determined except at runtime?

@charris (Member) commented Dec 22, 2020

Note that I'm perfectly happy to merge this, I just don't understand how it works :)

@BvB93 (Member, Author) commented Dec 22, 2020

That works for scalars, but how does it work for a deeply nested list without examining every element?

If this list is created by a function, it would be that function's responsibility to ensure that its type and nesting level are annotated correctly. On the other hand, if the list is defined as-is, then type checkers can already infer these things:

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Revealed type is 'builtins.list[builtins.list*[builtins.list*[builtins.list*[builtins.list*[builtins.float*]]]]]'
    reveal_type([[[[[0.0]]]]])

As a side note, since mypy lacks support for proper recursive type aliases, we are limited in how deep we can go down the recursion rabbit hole:

from typing import Any, Sequence, TypeVar, Union

_ScalarType = TypeVar("_ScalarType")

# Plan A: nesting level <= 4 (can potentially be made larger or smaller)
_NestedSequence = Union[
    _ScalarType,
    Sequence[_ScalarType],
    Sequence[Sequence[_ScalarType]],
    Sequence[Sequence[Sequence[_ScalarType]]],
    Sequence[Sequence[Sequence[Sequence[_ScalarType]]]],
]

# Plan B: nesting level > 4 (can't infer the scalar type here) 
_RecursiveSequence = Sequence[Sequence[Sequence[Sequence[Sequence[Any]]]]]
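
A hedged usage sketch of how the two plans could be combined into a single annotation; the combined alias and function names below are made up for illustration and are not the aliases added by this PR:

import numpy as np

# Hypothetical combination of Plan A and Plan B for a float-coercible
# array-like parameter.
_ArrayLikeFloatCombined = Union[
    _NestedSequence[float],  # nesting level <= 4, scalar type preserved
    _RecursiveSequence,      # nesting level > 4, scalar type degraded to Any
]

def as_float_array(a: "_ArrayLikeFloatCombined") -> "np.ndarray[Any, np.dtype[np.float64]]":
    # Runtime behaviour is unchanged; only the static signature is refined.
    return np.asarray(a, dtype=np.float64)

as_float_array([[1.0, 2.0], [3.0, 4.0]])  # accepted via Plan A
as_float_array([[[[[[0.0]]]]]])           # 6 levels deep: falls back to Plan B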

@charris merged commit 557ed6a into numpy:master on Dec 22, 2020
@charris (Member) commented Dec 22, 2020

Thanks Bas.
