gh-111926: Make weakrefs thread-safe in free-threaded builds by mpage · Pull Request #117168 · python/cpython · GitHub

gh-111926: Make weakrefs thread-safe in free-threaded builds #117168


Merged
merged 67 commits on Apr 8, 2024
Changes from 1 commit

Commits (67)
9bf3820
Add _PyObject_SetMaybeWeakref
mpage Mar 14, 2024
75603c1
Make PyWeakref_NewRef thread-safe
mpage Mar 15, 2024
c4f8453
Make PyWeakref_NewProxy thread-safe
mpage Mar 15, 2024
ece230f
Make weakref___new___ thread-safe
mpage Mar 15, 2024
9acae59
Make weakref_hash thread-safe
mpage Mar 15, 2024
6d00f51
Make _PyStaticType_ClearWeakRefs thread-safe
mpage Mar 15, 2024
363a1a6
Handle multiple weakrefs
mpage Mar 15, 2024
c9b3bf9
Make _PyWeakref_GetWeakrefCount thread-safe
mpage Mar 16, 2024
b7b3315
Make _weakref._getweakrefs thread-safe
mpage Mar 16, 2024
e6e2d28
Make weakref clearing in subtype_dealloc thread-safe
mpage Mar 16, 2024
0d259d1
Fix incorrect size in test_sys for weakref/proxy
mpage Mar 16, 2024
7cc523b
Use a once flag
mpage Mar 18, 2024
11be247
Simplify clearing callback-free refs
mpage Mar 18, 2024
8fcd33f
Rationalize naming
mpage Mar 18, 2024
1e1e609
Fix warning
mpage Mar 18, 2024
19a274e
Simplify ClearWeakRefs
mpage Mar 19, 2024
817a5a3
Refactor clearing code
mpage Mar 19, 2024
d493e7d
Protect wr_object with a separate mutex
mpage Mar 20, 2024
88078a6
Switch to a sharded lock for linked list and object
mpage Mar 21, 2024
dc1f651
Fix formatting
mpage Mar 22, 2024
37c511a
Remove vestigial include
mpage Mar 22, 2024
ccaeded
Make sure we set the cleared flag
mpage Mar 22, 2024
6a50381
Use `Py_TryIncref` when returning the ref
mpage Mar 23, 2024
d998d1b
Don't leak ref if initialization fails
mpage Mar 24, 2024
37076b5
Remove clinic critical section directives for `_weakref` module
mpage Mar 24, 2024
a07d2db
Support async removal of dead weakrefs from WeakValueDictionary
mpage Mar 24, 2024
a901848
Only use the striped lock
mpage Mar 26, 2024
8c85fd3
Refactor weakref creation
mpage Mar 27, 2024
0021bc5
Fixups
mpage Mar 27, 2024
40d99d4
Fix formatting
mpage Mar 27, 2024
dd922b1
Use QSBR to avoid locking in PyType_IsSubtype
mpage Mar 26, 2024
681872e
Remove unnecessary include
mpage Mar 27, 2024
8982831
Fix unused function warning in default builds
mpage Mar 27, 2024
334c766
Check for callable proxy
mpage Mar 28, 2024
0549445
Refactor weakref creation
mpage Mar 29, 2024
eb12c06
Simplify comment
mpage Mar 29, 2024
700cf4c
Get rid of GetWeakrefCountThreadsafe
mpage Mar 29, 2024
63200cd
Don't explicitly zero initialize striped lock
mpage Mar 29, 2024
e9eed72
Indent conditional define
mpage Mar 29, 2024
e3d5779
Use non-atomic loads while we hold the lock
mpage Mar 29, 2024
f31f1e5
Refactor a _PyWeakref_GET_REF and _PyWeakref_IS_DEAD to share more code
mpage Mar 29, 2024
b5c9561
Merge helper functions for clearing weakrefs
mpage Mar 29, 2024
93425e6
Make _PyStaticType_ClearWeakRefs have one implementation for default …
mpage Mar 29, 2024
89edf02
Make sure we're able to index into the striped lock even if GC clears us
mpage Mar 29, 2024
7f73480
Consolidate implementations of _PyWeakref_ClearRef and gc_clear
mpage Mar 29, 2024
d742ed1
Rename `_Py_TryIncref` to `_Py_TryIncrefCompare`
mpage Mar 29, 2024
37f9ceb
Add `_Py_TryIncref`
mpage Mar 29, 2024
fb9eee5
Refactor `weakref._getweakrefs` to have one implementation
mpage Mar 29, 2024
d760b99
Refactor to have one implementation for `PyObject_ClearWeakRefs`
mpage Mar 29, 2024
491e4ff
Revert "Use QSBR to avoid locking in PyType_IsSubtype"
mpage Mar 29, 2024
531a076
Accept thread unsafety when checking subtype for now
mpage Mar 29, 2024
ed280d8
Update modules to use PyWeakref_GetRef
mpage Mar 29, 2024
2750787
Merge branch 'main' into gh-111926-thread-safe-weakref
mpage Mar 29, 2024
5e7c8bb
Use a pointer to the appropriate mutex rather than a copy of the obje…
mpage Mar 29, 2024
37c9529
Fix formatting
mpage Mar 29, 2024
fece88d
Merge implementation of try_reuse_basic_ref
mpage Mar 29, 2024
7a86e35
Remove redundant gc untrack
mpage Apr 2, 2024
f89edd1
Update comment
mpage Apr 2, 2024
5a0b976
Add missing braces
mpage Apr 2, 2024
9e67740
Use Py_ssize_t for tracking number of items
mpage Apr 2, 2024
55a211f
Tighten up clearing basic refs/proxies
mpage Apr 2, 2024
cdb9290
Simplify _PyStaticType_ClearWeakRefs
mpage Apr 2, 2024
c93c336
Simplify get_or_create_weakref
mpage Apr 2, 2024
0366f01
Add _PyWeakref_IsDead
mpage Apr 2, 2024
3e135e4
Remove TODO
mpage Apr 2, 2024
dbf7c58
Fix formatting
mpage Apr 2, 2024
73466bf
Update Python/pystate.c
colesbury Apr 8, 2024
Switch to a sharded lock for linked list and object
mpage committed Mar 22, 2024
commit 88078a686bf0e6ca509d0819289634efdf25da8f
7 changes: 0 additions & 7 deletions Include/cpython/object.h
@@ -465,13 +465,6 @@ PyAPI_FUNC(void) _PyTrash_end(PyThreadState *tstate);
/* Python 3.10 private API, invoked by the Py_TRASHCAN_BEGIN(). */
PyAPI_FUNC(int) _PyTrash_cond(PyObject *op, destructor dealloc);

#ifdef Py_GIL_DISABLED

/* Private API, used by weakrefs in free-threaded builds */
PyAPI_FUNC(int) _PyTrash_contains(PyObject *op);

#endif

#define Py_TRASHCAN_BEGIN_CONDITION(op, cond) \
do { \
PyThreadState *_tstate = NULL; \
15 changes: 11 additions & 4 deletions Include/internal/pycore_critical_section.h
@@ -87,10 +87,12 @@ extern "C" {
#define _Py_CRITICAL_SECTION_MASK 0x3

#ifdef Py_GIL_DISABLED
# define Py_BEGIN_CRITICAL_SECTION(op) \
# define Py_BEGIN_CRITICAL_SECTION_MU(mu) \
{ \
_PyCriticalSection _cs; \
_PyCriticalSection_Begin(&_cs, &_PyObject_CAST(op)->ob_mutex)
_PyCriticalSection_Begin(&_cs, &mu)

# define Py_BEGIN_CRITICAL_SECTION(op) Py_BEGIN_CRITICAL_SECTION_MU(_PyObject_CAST(op)->ob_mutex)

# define Py_END_CRITICAL_SECTION() \
_PyCriticalSection_End(&_cs); \
@@ -105,10 +107,13 @@ extern "C" {
_PyCriticalSection_XEnd(&_cs_opt); \
}

# define Py_BEGIN_CRITICAL_SECTION2(a, b) \
# define Py_BEGIN_CRITICAL_SECTION2_MU(a, b) \
{ \
_PyCriticalSection2 _cs2; \
_PyCriticalSection2_Begin(&_cs2, &_PyObject_CAST(a)->ob_mutex, &_PyObject_CAST(b)->ob_mutex)
_PyCriticalSection2_Begin(&_cs2, &a, &b)

# define Py_BEGIN_CRITICAL_SECTION2(a, b) \
Py_BEGIN_CRITICAL_SECTION2_MU(_PyObject_CAST(a)->ob_mutex, _PyObject_CAST(b)->ob_mutex)

# define Py_END_CRITICAL_SECTION2() \
_PyCriticalSection2_End(&_cs2); \
@@ -138,10 +143,12 @@ extern "C" {

#else /* !Py_GIL_DISABLED */
// The critical section APIs are no-ops with the GIL.
# define Py_BEGIN_CRITICAL_SECTION_MU(mu)
# define Py_BEGIN_CRITICAL_SECTION(op)
# define Py_END_CRITICAL_SECTION()
# define Py_XBEGIN_CRITICAL_SECTION(op)
# define Py_XEND_CRITICAL_SECTION()
# define Py_BEGIN_CRITICAL_SECTION2_MU(a, b)
# define Py_BEGIN_CRITICAL_SECTION2(a, b)
# define Py_END_CRITICAL_SECTION2()
# define _Py_CRITICAL_SECTION_ASSERT_MUTEX_LOCKED(mutex)
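The macro change above splits each critical-section entry point into a `*_MU` form that takes an explicit mutex and an object form that forwards the object's own `ob_mutex`. The following is a minimal self-contained sketch of that pattern, with plain pthread mutexes standing in for CPython's `_PyCriticalSection` machinery (the `FakeObject` type and all names are illustrative, not CPython definitions):

```c
#include <pthread.h>

/* The *_MU variant takes any mutex; it opens a scope that the matching
 * END macro closes, just like Py_BEGIN/END_CRITICAL_SECTION. */
#define BEGIN_CRITICAL_SECTION_MU(mu) \
    { \
        pthread_mutex_t *_cs_mu = &(mu); \
        pthread_mutex_lock(_cs_mu);

#define END_CRITICAL_SECTION() \
        pthread_mutex_unlock(_cs_mu); \
    }

/* The object form becomes a thin wrapper that passes the object's own
 * lock, mirroring Py_BEGIN_CRITICAL_SECTION(op) after this change. */
typedef struct { pthread_mutex_t ob_mutex; int value; } FakeObject;

#define BEGIN_CRITICAL_SECTION(op) BEGIN_CRITICAL_SECTION_MU((op)->ob_mutex)

static int
read_value(FakeObject *op)
{
    int v;
    BEGIN_CRITICAL_SECTION(op);
    v = op->value;
    END_CRITICAL_SECTION();
    return v;
}
```

The point of the split is that callers like the weakref code can now guard a region with a lock that is not embedded in any one object, while existing `Py_BEGIN_CRITICAL_SECTION(op)` call sites keep working unchanged.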
7 changes: 7 additions & 0 deletions Include/internal/pycore_interp.h
@@ -59,6 +59,12 @@ struct _stoptheworld_state {
PyThreadState *requester; // Thread that requested the pause (may be NULL).
};

#ifdef Py_GIL_DISABLED
// This should be prime but otherwise the choice is arbitrary. A larger value
// increases concurrency at the expense of memory.
#define NUM_WEAKREF_LIST_LOCKS 127
#endif

/* cross-interpreter data registry */

/* Tracks some rare events per-interpreter, used by the optimizer to turn on/off
@@ -203,6 +209,7 @@ struct _is {
#if defined(Py_GIL_DISABLED)
struct _mimalloc_interp_state mimalloc;
struct _brc_state brc; // biased reference counting state
PyMutex weakref_locks[NUM_WEAKREF_LIST_LOCKS];
#endif

// Per-interpreter state for the obmalloc allocator. For the main
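The hunk above adds a fixed array of `NUM_WEAKREF_LIST_LOCKS` (127, a prime) mutexes to the interpreter state: a striped lock, where each object's address is hashed to one slot. A self-contained sketch of the indexing scheme, using pthread mutexes in place of CPython's `PyMutex`:

```c
#include <pthread.h>
#include <stdint.h>
#include <stddef.h>

/* Prime, as the comment in the diff notes; otherwise the choice is
 * arbitrary. More stripes means more concurrency but more memory. */
#define NUM_WEAKREF_LIST_LOCKS 127

static pthread_mutex_t weakref_locks[NUM_WEAKREF_LIST_LOCKS] = {
    [0 ... NUM_WEAKREF_LIST_LOCKS - 1] = PTHREAD_MUTEX_INITIALIZER
};

/* Same idea as the WEAKREF_LIST_LOCK(obj) macro: reduce the object's
 * address modulo the stripe count. A prime count spreads the typically
 * aligned (low-bits-zero) addresses across slots more evenly. */
static size_t
weakref_lock_index(const void *obj)
{
    return (uintptr_t)obj % NUM_WEAKREF_LIST_LOCKS;
}

static pthread_mutex_t *
weakref_list_lock(const void *obj)
{
    return &weakref_locks[weakref_lock_index(obj)];
}
```

Two objects may share a stripe (that only costs some contention, never correctness), and the same object always maps to the same lock, which is what serializing mutation of its weakref list requires.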
36 changes: 18 additions & 18 deletions Include/internal/pycore_weakref.h
@@ -8,16 +8,25 @@ extern "C" {
# error "this header requires Py_BUILD_CORE define"
#endif

#include "pycore_critical_section.h" // Py_BEGIN_CRITICAL_SECTION()
#include "pycore_lock.h"
#include "pycore_object.h" // _Py_REF_IS_MERGED()

#ifdef Py_GIL_DISABLED

typedef struct _PyWeakRefClearState {
// Protects the weakref's wr_object and wr_callback.
// Protects the cleared flag below.
PyMutex mutex;
_PyOnceFlag once;

// Set if the weakref has been cleared.
int cleared;

Py_ssize_t refcount;
} _PyWeakRefClearState;

#define WEAKREF_LIST_LOCK(obj) _PyInterpreterState_GET()->weakref_locks[((uintptr_t) obj) % NUM_WEAKREF_LIST_LOCKS]

#endif

static inline int _is_dead(PyObject *obj)
@@ -36,23 +45,12 @@ static inline int _is_dead(PyObject *obj)
#endif
}

#if defined(Py_GIL_DISABLED)
# define LOCK_WR_OBJECT(weakref) PyMutex_Lock(&(weakref)->clear_state->mutex);
# define UNLOCK_WR_OBJECT(weakref) PyMutex_Unlock(&(weakref)->clear_state->mutex);
#else
# define LOCK_WR_OBJECT(weakref)
# define UNLOCK_WR_OBJECT(weakref)
#endif

// NB: In free-threaded builds nothing between the LOCK/UNLOCK calls below can
// suspend.

static inline PyObject* _PyWeakref_GET_REF(PyObject *ref_obj)
{
assert(PyWeakref_Check(ref_obj));
PyObject *ret = NULL;
PyWeakReference *ref = _Py_CAST(PyWeakReference*, ref_obj);
LOCK_WR_OBJECT(ref)
Py_BEGIN_CRITICAL_SECTION_MU(ref->clear_state->mutex);
PyObject *obj = ref->wr_object;

if (obj == Py_None) {
@@ -68,7 +66,7 @@ static inline PyObject* _PyWeakref_GET_REF(PyObject *ref_obj)
#endif
ret = Py_NewRef(obj);
end:
UNLOCK_WR_OBJECT(ref);
Py_END_CRITICAL_SECTION();
return ret;
}

@@ -77,7 +75,7 @@ static inline int _PyWeakref_IS_DEAD(PyObject *ref_obj)
assert(PyWeakref_Check(ref_obj));
int ret = 0;
PyWeakReference *ref = _Py_CAST(PyWeakReference*, ref_obj);
LOCK_WR_OBJECT(ref);
Py_BEGIN_CRITICAL_SECTION_MU(ref->clear_state->mutex);
PyObject *obj = ref->wr_object;
if (obj == Py_None) {
// clear_weakref() was called
@@ -87,14 +85,16 @@ static inline int _PyWeakref_IS_DEAD(PyObject *ref_obj)
// See _PyWeakref_GET_REF() for the rationale of this test
ret = _is_dead(obj);
}
UNLOCK_WR_OBJECT(ref);
Py_END_CRITICAL_SECTION();
return ret;
}

// NB: In free-threaded builds the referenced object must be live for the
// duration of the call.
// NB: In free-threaded builds the weakref list lock for the referenced object
// must be held around calls to this function.
extern Py_ssize_t _PyWeakref_GetWeakrefCount(PyWeakReference *head);

extern Py_ssize_t _PyWeakref_GetWeakrefCountThreadsafe(PyObject *obj);

// Clear all the weak references to obj but leave their callbacks uncalled and
// intact.
extern void _PyWeakref_ClearWeakRefsExceptCallbacks(PyObject *obj);
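The `_PyWeakref_GET_REF` change above reads `wr_object` and takes the new reference only while the weakref's `clear_state->mutex` is held, so a concurrent clear cannot race the incref. A simplified sketch of that locked-read pattern (the `Object`/`WeakRef` types and the plain refcount are stand-ins, not the CPython definitions, and `NULL` plays the role of `wr_object == Py_None`):

```c
#include <pthread.h>
#include <stddef.h>

typedef struct {
    pthread_mutex_t mutex;   /* plays the role of clear_state->mutex */
    int refcount;
} Object;

typedef struct {
    pthread_mutex_t *mutex;  /* shared with the clearing side */
    Object *wr_object;       /* NULL once the weakref has been cleared */
} WeakRef;

/* Return a new strong reference to the referent, or NULL if cleared.
 * The incref is safe because clearing takes the same mutex. */
static Object *
weakref_get_ref(WeakRef *ref)
{
    Object *ret = NULL;
    pthread_mutex_lock(ref->mutex);
    Object *obj = ref->wr_object;
    if (obj != NULL) {
        obj->refcount++;
        ret = obj;
    }
    pthread_mutex_unlock(ref->mutex);
    return ret;
}

static void
weakref_clear(WeakRef *ref)
{
    pthread_mutex_lock(ref->mutex);
    ref->wr_object = NULL;
    pthread_mutex_unlock(ref->mutex);
}
```

Either `weakref_get_ref` wins the lock first and hands out a strong reference before the clear lands, or the clear wins and the caller observes NULL; the half-cleared state is never visible.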
8 changes: 3 additions & 5 deletions Modules/_weakref.c
@@ -30,9 +30,7 @@ _weakref_getweakrefcount_impl(PyObject *module, PyObject *object)
if (!_PyType_SUPPORTS_WEAKREFS(Py_TYPE(object))) {
return 0;
}
PyWeakReference **list = GET_WEAKREFS_LISTPTR(object);
Py_ssize_t count = _PyWeakref_GetWeakrefCount(*list);
return count;
return _PyWeakref_GetWeakrefCountThreadsafe(object);
}


@@ -94,7 +92,7 @@ _weakref_getweakrefs_impl(PyObject *module, PyObject *object)
}

PyWeakReference **list = GET_WEAKREFS_LISTPTR(object);
Py_ssize_t count = _PyWeakref_GetWeakrefCount(*list);
Py_ssize_t count = _PyWeakref_GetWeakrefCountThreadsafe(object);

PyObject *result = PyList_New(count);
if (result == NULL) {
@@ -103,7 +101,7 @@

#ifdef Py_GIL_DISABLED
Py_ssize_t num_added = 0;
Py_BEGIN_CRITICAL_SECTION(object);
Py_BEGIN_CRITICAL_SECTION_MU(WEAKREF_LIST_LOCK(object));
PyWeakReference *current = *list;
// Weakrefs may be added or removed since the count was computed.
while (num_added < count && current != NULL) {
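The `_weakref_getweakrefs_impl` hunk sizes the result list from a count taken before the lock, then walks the list under the lock, stopping at whichever comes first: the end of the list or that count ("Weakrefs may be added or removed since the count was computed"). A minimal sketch of that bounded-copy pattern, with a plain array standing in for the `PyList`:

```c
#include <stddef.h>

typedef struct Node { struct Node *next; } Node;

/* Copy at most `count` nodes from the list into `out`, returning how
 * many were actually copied. The bound makes the walk safe even if the
 * list has grown since `count` was computed; a shrunken list simply
 * yields fewer entries. */
static size_t
collect_bounded(Node *head, Node **out, size_t count)
{
    size_t num_added = 0;
    for (Node *cur = head; num_added < count && cur != NULL; cur = cur->next) {
        out[num_added++] = cur;
    }
    return num_added;
}
```

In the real code the list is then shrunk to `num_added` entries if fewer were collected than allocated, rather than ever writing past the preallocated size.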
19 changes: 0 additions & 19 deletions Objects/object.c
@@ -2835,25 +2835,6 @@ _PyTrash_cond(PyObject *op, destructor dealloc)
return Py_TYPE(op)->tp_dealloc == dealloc;
}

#ifdef Py_GIL_DISABLED

int
_PyTrash_contains(PyObject *op)
{
PyThreadState *tstate = _PyThreadState_GET();
struct _py_trashcan *trash = _PyTrash_get_state(tstate);
PyObject *cur = trash->delete_later;
while (cur) {
if (cur == op) {
return 1;
}
cur = (PyObject *) cur->ob_tid;
}
return 0;
}

#endif


void _Py_NO_RETURN
_PyObject_AssertFailed(PyObject *obj, const char *expr, const char *msg,
10 changes: 1 addition & 9 deletions Objects/typeobject.c
@@ -2188,15 +2188,7 @@ subtype_dealloc(PyObject *self)
finalizers since they might rely on part of the object
being finalized that has already been destroyed. */
if (type->tp_weaklistoffset && !base->tp_weaklistoffset) {
/* Modeled after GET_WEAKREFS_LISTPTR().

This is never triggered for static types so we can avoid the
(slightly) more costly _PyObject_GET_WEAKREFS_LISTPTR(). */
PyWeakReference **list = \
_PyObject_GET_WEAKREFS_LISTPTR_FROM_OFFSET(self);
while (*list) {
_PyWeakref_ClearRef(*list);
}
_PyWeakref_ClearWeakRefsExceptCallbacks(self);
}
}

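The `subtype_dealloc` hunk replaces the open-coded loop over the object's weakref list with a single call to `_PyWeakref_ClearWeakRefsExceptCallbacks`, which clears every weakref but leaves callbacks uncalled. A rough sketch of what such a helper does, on an illustrative singly linked list (the node layout here is an assumption for the example, not CPython's `PyWeakReference`):

```c
#include <stddef.h>

typedef struct WeakRefNode {
    struct WeakRefNode *next;
    void *wr_object;   /* referent; NULL marks the weakref as dead */
} WeakRefNode;

/* Clear every weakref on the object's list: mark each one dead and
 * unlink it, without invoking any callback. Afterwards the list head
 * is empty and each node no longer references the dying object. */
static void
clear_weakrefs_except_callbacks(WeakRefNode **list)
{
    while (*list != NULL) {
        WeakRefNode *cur = *list;
        cur->wr_object = NULL;
        *list = cur->next;
        cur->next = NULL;
    }
}
```

Centralizing this in one helper lets the free-threaded build take the object's striped list lock in one place instead of at every call site that previously open-coded the loop.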