Description
Describe the issue:
I have an int64 array, a mesh tuple (as constructed by np.ix_), and a boolean array.
The shape of the integer array indexed by the mesh is the same as the shape of the boolean array.
We then want to add 1 at every position where the boolean array is True.
So:
>>> int_array[mesh][bool_array]
array([3])
>>> int_array[mesh][bool_array] += 1
>>> int_array[mesh][bool_array]
array([3])
In the example below I have tried multiple approaches, and the update behavior seems inconsistent to me.
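For reference, part of what makes chained updates like this subtle is that indexing with an advanced (mesh/integer) index returns a copy, so `arr[mesh][mask] += 1` desugars into an update on a temporary. A minimal, self-contained sketch (the array and index here are illustrative, not taken from the report):

```python
import numpy as np

a = np.arange(6).reshape(3, 2)
mesh = np.ix_([0, 2], [0, 1])   # advanced index: a[mesh] is a copy
mask = np.array([[True, False], [False, True]])

tmp = a[mesh]          # copy of the selected sub-array
tmp[mask] += 1         # modifies only the copy
print(a[mesh][mask])   # → [0 5]; the original array is unchanged
```

This is the documented copy semantics of advanced indexing; the question in this report is why the other update forms below behave differently across machines.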
Reproduce the code example:
import numpy as np
int_array = np.array(
[
[1, 8, 12, 9],
[3, 5, 10, 12],
[3, 8, 8, 11],
[0, 5, 14, 11],
[0, 6, 6, 18],
[0, 4, 12, 14],
[2, 5, 8, 15],
[3, 9, 7, 11],
[2, 8, 7, 13],
[3, 9, 8, 10],
[2, 8, 9, 11],
[0, 0, 0, 0],
[3, 8, 8, 11],
[0, 0, 0, 0],
[2, 9, 8, 11],
[0, 0, 0, 0],
[2, 8, 9, 11],
[3, 6, 12, 9],
[0, 0, 0, 0],
[3, 7, 7, 13],
[3, 8, 8, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 7, 10, 10],
[2, 8, 8, 12],
[2, 8, 8, 12],
[4, 8, 7, 11],
[2, 6, 7, 15],
[0, 0, 0, 0],
[3, 7, 9, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[2, 9, 7, 12],
[0, 0, 0, 0],
[0, 8, 8, 14],
[4, 8, 7, 11],
[2, 7, 7, 14],
[3, 8, 7, 12],
[0, 0, 0, 0],
[2, 7, 10, 11],
[3, 8, 8, 11],
[0, 0, 0, 0],
[3, 8, 8, 11],
[3, 9, 7, 11],
[0, 1, 10, 19],
[0, 0, 0, 0],
[4, 9, 7, 10],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 8, 8, 11],
[3, 7, 8, 12],
[3, 7, 7, 13],
[3, 7, 7, 13],
[2, 6, 8, 14],
[3, 6, 10, 11],
[0, 0, 0, 0],
[3, 8, 7, 12],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 6, 9, 12],
[0, 0, 0, 0],
[4, 7, 8, 11],
[3, 7, 10, 10],
[0, 0, 0, 0],
[3, 10, 7, 10],
[2, 6, 6, 16],
[0, 0, 0, 0],
[2, 7, 8, 13],
[0, 0, 0, 0],
[3, 9, 7, 11],
[0, 0, 0, 0],
[0, 8, 8, 14],
[0, 0, 0, 0],
[0, 3, 5, 22],
[3, 6, 9, 12],
[0, 0, 0, 0],
[4, 7, 8, 11],
[3, 8, 8, 11],
[0, 0, 0, 0],
[3, 7, 8, 12],
[3, 9, 8, 10],
[0, 0, 0, 0],
[0, 0, 0, 0],
[4, 8, 7, 11],
[0, 4, 9, 17],
[0, 0, 0, 0],
[3, 6, 8, 13],
[3, 8, 9, 10],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[2, 6, 7, 15],
[3, 9, 7, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[2, 8, 8, 12],
[4, 7, 7, 12],
[1, 7, 9, 13],
[2, 6, 7, 15],
[3, 8, 8, 11],
[0, 0, 0, 0],
[2, 7, 9, 12],
[0, 0, 0, 0],
[2, 6, 11, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 7, 7, 12],
[4, 8, 8, 10],
[3, 8, 8, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 8, 7, 12],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 8, 7, 12],
[2, 7, 6, 15],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 8, 7, 12],
[0, 4, 7, 19],
[2, 6, 8, 14],
[2, 8, 8, 12],
[0, 0, 0, 0],
[3, 7, 9, 11],
[0, 0, 0, 0],
[0, 7, 11, 12],
[0, 0, 0, 0],
[3, 8, 7, 12],
[0, 0, 0, 0],
[2, 8, 12, 8],
[0, 0, 0, 0],
[0, 0, 0, 0],
[2, 11, 9, 8],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[4, 8, 7, 11],
[2, 8, 7, 13],
[3, 8, 8, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 10, 7, 10],
[2, 8, 10, 10],
[2, 6, 9, 13],
[4, 8, 6, 12],
[2, 9, 7, 12],
[2, 5, 7, 16],
[0, 0, 0, 0],
[0, 5, 14, 11],
[3, 8, 8, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[1, 8, 11, 10],
[2, 8, 10, 10],
[2, 7, 6, 15],
[0, 0, 0, 0],
[3, 8, 7, 12],
[0, 0, 0, 0],
[0, 6, 7, 17],
[4, 11, 6, 9],
[3, 6, 9, 12],
[3, 7, 9, 11],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 4, 8, 18],
[1, 7, 13, 9],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 4, 8, 18],
[0, 0, 0, 0],
[3, 8, 5, 14],
[3, 8, 7, 12],
[0, 0, 0, 0],
[0, 3, 3, 24],
[3, 7, 9, 11],
[2, 7, 13, 8],
[0, 0, 0, 0],
[0, 0, 0, 0],
[3, 6, 7, 14],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[4, 8, 7, 11],
[4, 8, 7, 11],
[0, 0, 0, 0],
[4, 1, 2, 23],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
[0, 0, 0, 0],
],
dtype=np.int64,
)
bool_array = np.array(
[
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[True, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
[False, False, False, False],
],
dtype=bool,
)
mesh = (
np.array(
[
[0],
[1],
[2],
[3],
[4],
[5],
[6],
[7],
[8],
[9],
[10],
[12],
[14],
[16],
[17],
[19],
[20],
[23],
[24],
[25],
[26],
[27],
[29],
[32],
[34],
[35],
[36],
[37],
[39],
[40],
[42],
[43],
[44],
[46],
[51],
[52],
[53],
[54],
[55],
[56],
[58],
[62],
[64],
[65],
[67],
[68],
[70],
[72],
[74],
[76],
[77],
[80],
[82],
[83],
[86],
[87],
[89],
[90],
[95],
[96],
[101],
[102],
[103],
[104],
[105],
[107],
[109],
[112],
[113],
[114],
[117],
[122],
[123],
[127],
[128],
[129],
[130],
[132],
[134],
[136],
[138],
[141],
[147],
[148],
[149],
[154],
[155],
[156],
[157],
[158],
[159],
[161],
[162],
[165],
[166],
[167],
[169],
[171],
[172],
[173],
[174],
[177],
[178],
[181],
[183],
[184],
[186],
[187],
[188],
[196],
[197],
[199],
[79],
[191],
[0],
[1],
[2],
[3],
[4],
[5],
[6],
[7],
[8],
[9],
[10],
[12],
[14],
[16],
[17],
[19],
[20],
[23],
[24],
[25],
[26],
[27],
[29],
[32],
[34],
[35],
[36],
[37],
[39],
[40],
[42],
[43],
[44],
[46],
[51],
[52],
[53],
[54],
[55],
[56],
[58],
[62],
[64],
[65],
[67],
[68],
[70],
[72],
[74],
[76],
[77],
[79],
[80],
[82],
[83],
[86],
[87],
[89],
[90],
[95],
[96],
[101],
[102],
[103],
[104],
[105],
[107],
[109],
[112],
[113],
[114],
[117],
[122],
[123],
[127],
[128],
[129],
[130],
[132],
[134],
[136],
[138],
[141],
[147],
[148],
[149],
[154],
[155],
[156],
[157],
[158],
[159],
[161],
[162],
[165],
[166],
[167],
[169],
[171],
[172],
[173],
[174],
[177],
[178],
[181],
[183],
[184],
[186],
[187],
[188],
[191],
[196],
[197],
[199],
],
dtype=int,
),
np.array([[0, 1, 2, 3]], dtype=int),
)
print(int_array.shape, int_array.dtype)
print(bool_array.shape, bool_array.dtype)
print(mesh[0].shape, mesh[0].dtype)
print(mesh[1].shape, mesh[1].dtype)
# Does not work
print("double indexing inplace addition")
print(int_array[mesh][bool_array])
int_array[mesh][bool_array] += 1
print(int_array[mesh][bool_array])
print("double indexing overwrite")
print(int_array[mesh][bool_array])
int_array[mesh][bool_array] = 10
print(int_array[mesh][bool_array])
print("Addition with array of same shape")
print(int_array[mesh][bool_array])
int_array[mesh] += np.where(bool_array, 1, 0)
print(int_array[mesh][bool_array])
bool_array[0, 0] = True
# Also does not work
print("Changing the boolean array")
print(int_array[mesh][bool_array])
int_array[mesh][bool_array] = 0
print(int_array[mesh][bool_array])
# Works
print("Adding a (broadcasted) array works")
print(int_array[:5])
int_array[mesh] += np.array([100, 200, 300, 400])
print(int_array[:5])

Error message:
not applicable

Python and NumPy Versions:
I tried multiple versions with the same result:
2.4.1
3.13.9 (main, Nov 19 2025, 22:47:49) [Clang 21.1.4 ]
2.3.0
3.13.9 (main, Nov 19 2025, 22:47:49) [Clang 21.1.4 ]
2.2.0
3.13.9 (main, Nov 19 2025, 22:47:49) [Clang 21.1.4 ]
Runtime Environment:
2.4.1:
[{'numpy_version': '2.4.1',
'python': '3.13.9 (main, Nov 19 2025, 22:47:49) [Clang 21.1.4 ]',
'uname': uname_result(system='Linux', node='rolfDebianlt', release='6.12.63+deb13-amd64', version='#1 SMP PREEMPT_DYNAMIC Debian 6.12.63-1 (2025-12-30)', machine='x86_64')},
{'simd_extensions': {'baseline': ['X86_V2'],
'found': ['X86_V3'],
'not_found': ['X86_V4', 'AVX512_ICL', 'AVX512_SPR']}},
{'ignore_floating_point_errors_in_matmul': False},
{'architecture': 'Haswell',
'filepath': '/home/rolf/thunderstock/test/.venv/lib/python3.13/site-packages/numpy.libs/libscipy_openblas64_-fdde5778.so',
'internal_api': 'openblas',
'num_threads': 20,
'prefix': 'libscipy_openblas',
'threading_layer': 'pthreads',
'user_api': 'blas',
'version': '0.3.30'}]
2.3.0:
[{'numpy_version': '2.3.0',
'python': '3.13.9 (main, Nov 19 2025, 22:47:49) [Clang 21.1.4 ]',
'uname': uname_result(system='Linux', node='rolfDebianlt', release='6.12.63+deb13-amd64', version='#1 SMP PREEMPT_DYNAMIC Debian 6.12.63-1 (2025-12-30)', machine='x86_64')},
{'simd_extensions': {'baseline': ['SSE', 'SSE2', 'SSE3'],
'found': ['SSSE3',
'SSE41',
'POPCNT',
'SSE42',
'AVX',
'F16C',
'FMA3',
'AVX2'],
'not_found': ['AVX512F',
'AVX512CD',
'AVX512_KNL',
'AVX512_KNM',
'AVX512_SKX',
'AVX512_CLX',
'AVX512_CNL',
'AVX512_ICL',
'AVX512_SPR']}},
{'architecture': 'Haswell',
'filepath': '/home/rolf/thunderstock/test/.venv/lib/python3.13/site-packages/numpy.libs/libscipy_openblas64_-56d6093b.so',
'internal_api': 'openblas',
'num_threads': 20,
'prefix': 'libscipy_openblas',
'threading_layer': 'pthreads',
'user_api': 'blas',
'version': '0.3.29'}]
2.2.0:
[{'numpy_version': '2.2.0',
'python': '3.13.9 (main, Nov 19 2025, 22:47:49) [Clang 21.1.4 ]',
'uname': uname_result(system='Linux', node='rolfDebianlt', release='6.12.63+deb13-amd64', version='#1 SMP PREEMPT_DYNAMIC Debian 6.12.63-1 (2025-12-30)', machine='x86_64')},
{'simd_extensions': {'baseline': ['SSE', 'SSE2', 'SSE3'],
'found': ['SSSE3',
'SSE41',
'POPCNT',
'SSE42',
'AVX',
'F16C',
'FMA3',
'AVX2'],
'not_found': ['AVX512F',
'AVX512CD',
'AVX512_KNL',
'AVX512_KNM',
'AVX512_SKX',
'AVX512_CLX',
'AVX512_CNL',
'AVX512_ICL']}},
{'architecture': 'Haswell',
'filepath': '/home/rolf/thunderstock/test/.venv/lib/python3.13/site-packages/numpy.libs/libscipy_openblas64_-6bb31eeb.so',
'internal_api': 'openblas',
'num_threads': 20,
'prefix': 'libscipy_openblas',
'threading_layer': 'pthreads',
'user_api': 'blas',
'version': '0.3.28'}]
How does this issue affect you or how did you find it:
The code works for my colleague on his MacBook, but fails on my Debian Linux system, and it also appears to fail on our cloud (AWS Lambda, x86).
I found it because our Lambda was timing out; testing locally, I traced this to an infinite loop.
We had assumed the operation above worked, and it did behave as expected until we hit this specific example.
My colleague could not reproduce the bug locally on his machine, but it was reproducible on mine.
So in this case we had an impact, but for now an easy workaround is in place.
However, the same structure might also be present in other calculations, where it would not create an infinite loop but would silently produce wrong results.
So I cannot say for certain how much this impacts us right now.
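For context, one pattern that sidesteps intermediate copies entirely is `np.ufunc.at`, which performs an unbuffered in-place update directly on the base array at the given indices. A minimal sketch (the array and indices are illustrative, not the workaround actually used in our code):

```python
import numpy as np

a = np.arange(6, dtype=np.int64).reshape(3, 2)
rows = np.array([0, 2])
cols = np.array([0, 1])

# In-place addition at (0, 0) and (2, 1); writes into `a` itself,
# with no intermediate copy to lose the update in.
np.add.at(a, (rows, cols), 1)
print(a[0, 0], a[2, 1])   # → 1 6
```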