np.ctypeslib.as_array leaks memory when used on a pointer #6511
Comments
I've observed similar behavior in my application. Without a doubt, there is a memory leak in np.ctypeslib.as_array. Is there any known workaround?
In my application both the memory address and the size remained constant most of the time. This allowed me to work around the bug by caching the previously created array:

```python
_cache = None

def array_from_pointer(pnt, shape):
    global _cache
    if (_cache is None or
            _cache.shape != shape or
            _cache.__array_interface__['data'][0] != ctypes.addressof(pnt.contents)):
        _cache = np.ctypeslib.as_array(pnt, shape)
    return _cache
```

Note: instead of a single-item cache, it is of course possible to do full memoization; there are good implementations of memoize decorators floating around on the web.
I'm not familiar enough with ctypes to make a stab at this myself. A PR would be welcome.
As a workaround, instead of
how about
Yeah, that's exactly what I've been doing. It prevents the leak in my case.
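For reference, one form the "avoid passing a bare pointer" workaround can take is the following sketch (the thread elides the exact before/after code, so `as_array_noleak` and its shape are assumptions): build a sized ctypes array over the same memory and hand that to `as_array`, so the pointer-plus-shape code path where the leak was observed is never entered.

```python
import ctypes

import numpy as np

def as_array_noleak(pnt, n):
    # Construct a sized ctypes array type over the pointer's memory
    # and wrap that, instead of passing the bare pointer plus a shape.
    arr_t = pnt._type_ * n
    buf = arr_t.from_address(ctypes.addressof(pnt.contents))
    return np.ctypeslib.as_array(buf)
```

The returned array shares memory with the original buffer, so the caller must keep the underlying allocation alive for as long as the array is used.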
The leak can be demonstrated with the following script:

```python
import ctypes
import sys

import numpy as np

# create an array to work with
N = 100
a = np.arange(N)

# get a pointer to the array
pnt = np.ctypeslib.as_ctypes(a)

with np.testing.assert_no_gc_cycles():
    # create a raw pointer (this is how my real C function works)
    newpnt = ctypes.cast(pnt, ctypes.POINTER(ctypes.c_long))
    # and construct an array using this data
    b = np.ctypeslib.as_array(newpnt, (N,))
    # now delete both, which should clean up both objects
    del newpnt, b
```

(Currently this test doesn't work at all on master, until #10970 is merged.) Simply changing from

to

makes that leak go away.
Previously, a local `Stream` class would be defined every time a format needed parsing. Classes in CPython create reference cycles, which put load on the GC. This may or may not resolve numpygh-6511.
I have a C++ function returning an array, which I convert to a numpy array using np.ctypeslib.as_array(pointer_from_C++_function, (size_of_array,)).
This works as expected, but when I call this function repeatedly (about a million times) I see a substantial increase in memory usage for my Python process.
Running a sample script (attached, output attached) through valgrind's memcheck (output attached), it appears that the problem is in ctors.c, where the Python C API function PyErr_WarnEx is called. The strange thing is that I never see a warning appear in my Python output, so this could also be a bug in Python.
For now I will try to work around this problem, but it would be great if this could be fixed.
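A minimal self-contained version of this reproduction could look like the sketch below (the real C++ function is not available here, so `convert` and `object_growth` are hypothetical stand-ins; a locally allocated ctypes buffer plays the role of the C++-returned pointer, and live-object counts approximate the growth):

```python
import ctypes
import gc

import numpy as np

def convert(pnt, size):
    # Stand-in for wrapping the pointer returned by the C++ function.
    return np.ctypeslib.as_array(pnt, (size,))

def object_growth(iterations=1000, size=100):
    # Repeatedly wrap the same buffer; on affected numpy versions the
    # number of live Python objects kept growing with each call.
    buf = (ctypes.c_long * size)(*range(size))
    pnt = ctypes.cast(buf, ctypes.POINTER(ctypes.c_long))
    gc.collect()
    before = len(gc.get_objects())
    for _ in range(iterations):
        convert(pnt, size)
    gc.collect()
    return len(gc.get_objects()) - before
```

On a fixed numpy, `object_growth()` should stay near zero regardless of the iteration count; on affected versions it scales with `iterations`.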
Details of installation:
- Ubuntu 15.04 x86-64
- Python: 3.4.3 (installed from the Ubuntu repo), but the problem arises with Python 2.7.9 too (also from the Ubuntu repo)
- NumPy: 1.10.1 (from pip), but the problem is also present in NumPy 1.8.2 (from the Ubuntu repo)
Working example:
Output of script
Relevant valgrind output