API: bump MAXDIMS/MAXARGS to 64 introduce NPY_AXIS_RAVEL #25149
Bumping these two limits has three main caveats:

1. We cannot break ABI for the iterator macros, which means the iterators need to keep using 32 via their own macro.
2. We used axis=MAXDIMS to mean axis=None; NPY_AXIS_RAVEL is introduced to replace this sentinel, and outside NumPy it will be resolved at run time rather than at compile time.
3. The old-style iterators cannot deal with high dimensions, so some functions that still use them simply won't work for such arrays.
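For context, a minimal sketch (not from this PR) of how downstream C code might consume the new sentinel. It uses the macro as spelled in the diff in this commit, NPY_RAVEL_AXIS, and the helper resolve_axis is hypothetical, not NumPy API:

#include <Python.h>
#include <numpy/arrayobject.h>

/*
 * Hypothetical downstream helper: returns the normalized axis, or sets
 * *ravel and returns 0 when the caller should operate on the flattened
 * array, or returns -1 with an exception set.  The axis=None sentinel
 * used to be NPY_MAXDIMS (i.e. 32); with this change it is
 * NPY_RAVEL_AXIS, whose numeric value is an implementation detail.
 */
static int
resolve_axis(PyArrayObject *arr, int axis, npy_bool *ravel)
{
    if (axis == NPY_RAVEL_AXIS) {
        *ravel = NPY_TRUE;               /* axis=None: ravel first */
        return 0;
    }
    *ravel = NPY_FALSE;
    if (axis < 0) {
        axis += PyArray_NDIM(arr);       /* support negative axes */
    }
    if (axis < 0 || axis >= PyArray_NDIM(arr)) {
        PyErr_SetString(PyExc_ValueError, "axis out of bounds");
        return -1;
    }
    return axis;
}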
@@ -181,11 +181,6 @@ cdef extern from "numpy/arrayobject.h":

         NPY_ARRAY_UPDATE_ALL

-    cdef enum:
-        NPY_MAXDIMS
-
-    npy_intp NPY_MAX_ELSIZE
-
     ctypedef void (*PyArray_VectorUnaryFunc)(void *, void *, npy_intp, void *, void *)

     ctypedef struct PyArray_ArrayDescr:

Review comment (on the removed declarations): Maybe a bit random, but

Review comment: Does this removal need a release note? Just to say that if someone was using and relying on it, it wasn't doing what they thought it did.

Reply: 🤷 I doubt anyone will notice, but sure, added.
@@ -183,7 +183,7 @@ PyArray_IntpConverter(PyObject *obj, PyArray_Dims *seq)
     if (len > NPY_MAXDIMS) {
         PyErr_Format(PyExc_ValueError,
                 "maximum supported dimension for an ndarray "
-                "is %d, found %d", NPY_MAXDIMS, len);
+                "is currently %d, found %d", NPY_MAXDIMS, len);
         Py_DECREF(seq_obj);
         return NPY_FAIL;
     }

@@ -325,23 +325,14 @@ NPY_NO_EXPORT int
 PyArray_AxisConverter(PyObject *obj, int *axis)
 {
     if (obj == Py_None) {
-        *axis = NPY_MAXDIMS;
+        *axis = NPY_RAVEL_AXIS;
     }
     else {
         *axis = PyArray_PyIntAsInt_ErrMsg(obj,
                                "an integer is required for the axis");
         if (error_converting(*axis)) {
             return NPY_FAIL;
         }
-        if (*axis == NPY_MAXDIMS){
-            /* NumPy 1.23, 2022-05-19 */
-            if (DEPRECATE("Using `axis=32` (MAXDIMS) is deprecated. "
-                          "32/MAXDIMS had the same meaning as `axis=None` which "
-                          "should be used instead. "
-                          "(Deprecated NumPy 1.23)") < 0) {
-                return NPY_FAIL;
-            }
-        }
     }
     return NPY_SUCCEED;
 }

Review comment (on the removed deprecation block): This is very niche, but, mentioned it in the release notes.
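For illustration, a hedged sketch (not part of this diff) of how an extension module might use PyArray_AxisConverter itself via PyArg_ParseTuple's "O&" converter slot; the method my_reduce and its behaviour are made up. After this commit, axis=None arrives as NPY_RAVEL_AXIS instead of NPY_MAXDIMS:

#include <Python.h>
#include <numpy/arrayobject.h>

/* Hypothetical extension method with an optional axis argument. */
static PyObject *
my_reduce(PyObject *self, PyObject *args)
{
    int axis = NPY_RAVEL_AXIS;           /* default: same meaning as axis=None */

    /* "O&" routes the Python-level argument through NumPy's converter. */
    if (!PyArg_ParseTuple(args, "|O&", PyArray_AxisConverter, &axis)) {
        return NULL;
    }
    if (axis == NPY_RAVEL_AXIS) {
        /* axis=None (or omitted): reduce over the flattened array */
    }
    else {
        /* reduce along the single requested axis */
    }
    Py_RETURN_NONE;                      /* placeholder result */
}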
Review comment: Is this a remnant of an earlier version that completely removed NPY_MAXDIMS from the public API? Maybe rephrase this to say NPY_MAXDIMS is there but not to rely on it because it might change or be removed in the future.

Reply: It wasn't a remnant, but rephrased a bit.
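To illustrate the reviewer's point, a hedged sketch (not from the PR) of why the exact value of NPY_MAXDIMS should not be relied on: it is a compile-time constant, so a buffer like the one below is sized by the headers the extension was built against (32 before this change, 64 after), not by the NumPy it runs with:

#include <Python.h>
#include <numpy/arrayobject.h>

/* Hypothetical downstream pattern: a scratch buffer sized for "any" array.
 * Its size is fixed when the extension is compiled. */
static void
copy_shape(PyArrayObject *arr)
{
    npy_intp shape[NPY_MAXDIMS];         /* 32 with old headers, 64 with new */
    int ndim = PyArray_NDIM(arr);

    for (int i = 0; i < ndim; i++) {
        shape[i] = PyArray_DIM(arr, i);
    }
    (void)shape;                         /* real code would go on to use it */
}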