Increase NPY_MAXARGS to more than 32 #4398
Where do we sit with this?
@seberg @FrancescAlted I have the sense this may have been fixed. Is that the case?
No, there was no change in nditer regarding this.
Allows larger numbers of arguments for nditer-using functions. This can be helpful for complex boolean expressions used in pytables. Closes numpygh-4398
What's the status regarding this one? I'm hitting the limit quite frequently while building queries dynamically from pandas/numexpr. I read through #4840 but could not find a clear conclusion there. Raising the limit to 256 would be enough for me personally, but removing the threshold completely would of course be best.
The limit is an issue I hit frequently as well. If it is not possible to remove the limit, are there suggested workarounds?
Did anything more come of this?
I have the same problem when using …, while … . Why is there such a limitation? Fortunately, it is possible to overcome this in pandas by passing … .
NumPy 1.12.1
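(A minimal sketch of the kind of pandas workaround this likely refers to — selecting the pure-Python evaluation engine so numexpr, and with it the NumPy operand limit, is not used. The DataFrame, column names, and query string are made up for illustration.)

```python
import numpy as np
import pandas as pd

df = pd.DataFrame(np.random.standard_normal((1_000, 5)), columns=list("abcde"))

# A query string built dynamically from many conditions.
expr = " & ".join(f"{c} > 0.1" for c in df.columns)

# engine="python" evaluates the expression without numexpr, so the
# NumPy iterator operand limit does not come into play.
result = df.query(expr, engine="python")
```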
The limit is way too low in my opinion as well. I sometimes use a hack from the user docs, along the lines of the sketch below.
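(A minimal sketch of one such workaround, assuming the goal is to combine many boolean conditions with numexpr without handing any single call more operands than the limit allows; the helper name, chunk size, and the `> 0` condition are illustrative, not from the original comment.)

```python
import numexpr as ne
import numpy as np

def evaluate_chunked_and(columns, chunk=30):
    """AND together many boolean conditions, `chunk` columns at a time,
    so each numexpr call stays below the NPY_MAXARGS operand limit."""
    result = None
    for start in range(0, len(columns), chunk):
        part = {f"c{i}": col for i, col in enumerate(columns[start:start + chunk])}
        expr = " & ".join(f"({name} > 0)" for name in part)
        partial = ne.evaluate(expr, local_dict=part)
        result = partial if result is None else (result & partial)
    return result

# Example: 100 input arrays would overflow a single numexpr/nditer call on
# builds where the limit is 32, but work fine when evaluated in chunks.
cols = [np.random.standard_normal(1_000) for _ in range(100)]
mask = evaluate_chunked_and(cols)
```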
Sounds like this needs an entry in NumPy 2.0.
Good plan (I see you already did, although I guess it isn't technically backward incompatible). I wonder if, now that C99 makes non-fixed-width stack arrays a bit more wieldy, we could actually think about relaxing it in specific places (such as NpyIter).
There are still cases where using small fixed-size arrays is advantageous for speed (they can be allocated on the stack), e.g., inside dispatch for …
At least 9 years later, the issue is still there. |
In trying to find the mailing list discussion, it seems numpy-discussion.10968.n7.nabble.com is no longer available (maybe temporarily?), and looking at the Feb 2014 mailing list archives (as hinted in the link at the top of the issue) I could not find the discussion. Nor could I find a relevant discussion from the link at the top of the closed PR #226 from 2012.
I am removing the "major release" again. There is no real ABI issue with this (unlike …). We could still consider just deleting it from the public API; nobody needs this anyway (there is one place where it is used publicly, but it's at the end of a struct).
gh-25149 increased the limit. @seberg can this issue be closed, or did you leave it open on purpose?
I had left it open, because there is an argument to remove the limit entirely or increase it more (at least for the main iterator). |
gh-28080 will remove any limitation from the iterator. Once that is done, one can look at removing the limitation from ufuncs as well (I am not sure if this issue is about ufuncs or just the iterator originally). |
It is quite frequent for some applications (numexpr, but others too) to hit the NPY_MAXARGS limit. You can find a report about this problem here:
PyTables/PyTables#286
Making this number larger (say 256) would alleviate the issue a lot.
PR #226 tries to tackle the problem, but probably just increasing the value would be enough. There has been a recent discussion in the numpy mailing list too:
http://mail.scipy.org/pipermail/numpy-discussion/2014-February/069266.html
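(A quick way to see the limit in practice — a minimal sketch; the operand count of 40 is arbitrary, and on NumPy builds where the limit has since been raised or lifted the call simply succeeds.)

```python
import numpy as np

# Build more operands than the historical NPY_MAXARGS of 32 allows.
operands = [np.arange(10.0) for _ in range(40)]

try:
    it = np.nditer(operands)
    print("iterator created with", len(operands), "operands")
except ValueError as exc:
    # Older NumPy rejects the call because it exceeds the operand limit.
    print("nditer refused:", exc)
```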