From 914ffb40d0d3ced2ed11e8f0edbc4dc4c569ccf6 Mon Sep 17 00:00:00 2001 From: Pablo Galindo Date: Mon, 2 Oct 2023 17:23:15 +0100 Subject: [PATCH 001/265] Post 3.11.6 --- Include/patchlevel.h | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Include/patchlevel.h b/Include/patchlevel.h index 70d71c11377ede..58a3297d17b872 100644 --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -23,7 +23,7 @@ #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.11.6" +#define PY_VERSION "3.11.6+" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. From 981696e342f41dc3f8c2ea94a79475c931d2c7cf Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Mon, 2 Oct 2023 21:43:05 +0200 Subject: [PATCH 002/265] [3.11] gh-108494: Document how to add a project in PCbuild/readme.txt (#110077) (#110232) gh-108494: Document how to add a project in PCbuild/readme.txt (#110077) (cherry picked from commit 6387b5313c60c1403785b2245db33372476ac304) --- PCbuild/readme.txt | 28 ++++++++++++++++++++++++++++ 1 file changed, 28 insertions(+) diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt index 092b2cc8c1a7f7..d6149a3cedeaa3 100644 --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -291,3 +291,31 @@ project, with some projects overriding certain specific values. The GUI doesn't always reflect the correct settings and may confuse the user with false information, especially for settings that automatically adapt for different configurations. + +Add a new project +----------------- + +For example, add a new _testclinic_limited project to build a new +_testclinic_limited extension, the file Modules/_testclinic_limited.c: + +* In PCbuild/, copy _testclinic.vcxproj to _testclinic_limited.vcxproj, + replace RootNamespace value with `_testclinic_limited`, replace + `_asyncio.c` with `_testclinic_limited.c`. +* Open Visual Studio, open PCbuild\pcbuild.sln solution, add the + PCbuild\_testclinic_limited.vcxproj project to the solution ("add existing + project). +* Add a dependency on the python project to the new _testclinic_limited + project. +* Save and exit Visual Studio. +* Add `;_testclinic_limited` to `` in + PCbuild\pcbuild.proj. +* Update "exts" in Tools\msi\lib\lib_files.wxs file or in + Tools\msi\test\test_files.wxs file (for tests). +* PC\layout\main.py needs updating if you add a test-only extension whose name + doesn't start with "_test". +* Add the extension to PCbuild\readme.txt (this file). +* Build Python from scratch (clean the solution) to check that the new project + is built successfully. +* Ensure the new .vcxproj and .vcxproj.filters files are added to your commit, + as well as the changes to pcbuild.sln, pcbuild.proj and any other modified + files. From bf6843e91f70a9956acb8a1cdd0564d3598b9a6e Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 2 Oct 2023 18:36:55 -0700 Subject: [PATCH 003/265] [3.11] [3.12] gh-109649: Enhance os.cpu_count() documentation (GH-110169) (#110226) [3.12] gh-109649: Enhance os.cpu_count() documentation (GH-110169) * gh-109649: Enhance os.cpu_count() documentation * Doc: Specify that os.cpu_count() counts *logicial* CPUs. * Doc: Specify that os.sched_getaffinity(0) is related to the calling thread. * Fix test_posix.test_sched_getaffinity(): restore the old CPU mask when the test completes! 
* Restore removed text (cherry picked from commit 5245b97e132ae071e2b574224e0788cab62fdcc9) Co-authored-by: Victor Stinner --- Doc/library/os.rst | 16 +++++++++------- Lib/test/test_posix.py | 1 + 2 files changed, 10 insertions(+), 7 deletions(-) diff --git a/Doc/library/os.rst b/Doc/library/os.rst index df01963e54a5e6..087a01d2d9efc1 100644 --- a/Doc/library/os.rst +++ b/Doc/library/os.rst @@ -4872,8 +4872,10 @@ operating system. .. function:: sched_getaffinity(pid, /) - Return the set of CPUs the process with PID *pid* (or the current process - if zero) is restricted to. + Return the set of CPUs the process with PID *pid* is restricted to. + + If *pid* is zero, return the set of CPUs the calling thread of the current + process is restricted to. .. _os-path: @@ -4914,12 +4916,12 @@ Miscellaneous System Information .. function:: cpu_count() - Return the number of CPUs in the system. Returns ``None`` if undetermined. - - This number is not equivalent to the number of CPUs the current process can - use. The number of usable CPUs can be obtained with - ``len(os.sched_getaffinity(0))`` + Return the number of logical CPUs in the system. Returns ``None`` if + undetermined. + This number is not equivalent to the number of logical CPUs the current + process can use. ``len(os.sched_getaffinity(0))`` gets the number of logical + CPUs the calling thread of the current process is restricted to .. versionadded:: 3.4 diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index e643d8e5a4ce63..e4956b55a07e0a 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -1205,6 +1205,7 @@ def test_sched_getaffinity(self): @requires_sched_affinity def test_sched_setaffinity(self): mask = posix.sched_getaffinity(0) + self.addCleanup(posix.sched_setaffinity, 0, list(mask)) if len(mask) > 1: # Empty masks are forbidden mask.pop() From 0c6ae623e5c0f2afb28ccc6a1b39b70097d27553 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 2 Oct 2023 22:36:35 -0700 Subject: [PATCH 004/265] [3.11] Remove unused Misc/requirements-test.txt (GH-110240) (#110254) Co-authored-by: Hugo van Kemenade --- Misc/requirements-test.txt | 1 - 1 file changed, 1 deletion(-) delete mode 100644 Misc/requirements-test.txt diff --git a/Misc/requirements-test.txt b/Misc/requirements-test.txt deleted file mode 100644 index 60e7ed20a3d510..00000000000000 --- a/Misc/requirements-test.txt +++ /dev/null @@ -1 +0,0 @@ -tzdata==2020.3 From a151afe7199553415c8c4ea6406b0d393afb119b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 3 Oct 2023 06:18:08 -0700 Subject: [PATCH 005/265] [3.11] Bump various dependencies in `Doc/requirements-oldest-sphinx.txt` (GH-110278) (#110281) Bump various dependencies in `Doc/requirements-oldest-sphinx.txt` (GH-110278) This resolves a Dependabot security alert on the repository for urllib3==2.0.4. (cherry picked from commit f1663a492e14c80c30cb9741fdc36fa221d5e30a) Co-authored-by: Alex Waygood --- Doc/requirements-oldest-sphinx.txt | 14 +++++++------- 1 file changed, 7 insertions(+), 7 deletions(-) diff --git a/Doc/requirements-oldest-sphinx.txt b/Doc/requirements-oldest-sphinx.txt index d3ef5bc17650ae..5de739fc10b085 100644 --- a/Doc/requirements-oldest-sphinx.txt +++ b/Doc/requirements-oldest-sphinx.txt @@ -13,16 +13,16 @@ python-docs-theme>=2022.1 # Sphinx 4.2 comes from ``needs_sphinx = '4.2'`` in ``Doc/conf.py``. 
alabaster==0.7.13 -Babel==2.12.1 +Babel==2.13.0 certifi==2023.7.22 -charset-normalizer==3.2.0 +charset-normalizer==3.3.0 colorama==0.4.6 -docutils==0.16 +docutils==0.17.1 idna==3.4 imagesize==1.4.1 -Jinja2==2.11.3 -MarkupSafe==1.1.1 -packaging==23.1 +Jinja2==3.1.2 +MarkupSafe==2.1.3 +packaging==23.2 Pygments==2.16.1 requests==2.31.0 snowballstemmer==2.2.0 @@ -33,4 +33,4 @@ sphinxcontrib-htmlhelp==2.0.1 sphinxcontrib-jsmath==1.0.1 sphinxcontrib-qthelp==1.0.3 sphinxcontrib-serializinghtml==1.1.5 -urllib3==2.0.4 +urllib3==2.0.6 From 3079aa3f5d4f54a05b9b67cb60625e45581f7ade Mon Sep 17 00:00:00 2001 From: Alex Waygood Date: Tue, 3 Oct 2023 16:09:06 +0100 Subject: [PATCH 006/265] [3.11] Enable ruff on `Lib/test/test_typing.py` (#110179) (#110290) --- .pre-commit-config.yaml | 2 +- Lib/test/.ruff.toml | 1 - Lib/test/test_typing.py | 80 ++++++++++++++++++++--------------------- 3 files changed, 41 insertions(+), 42 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 980b616559ff32..3cff5658ba4e81 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,6 +1,6 @@ repos: - repo: https://github.com/astral-sh/ruff-pre-commit - rev: v0.0.288 + rev: v0.0.292 hooks: - id: ruff name: Run Ruff on Lib/test/ diff --git a/Lib/test/.ruff.toml b/Lib/test/.ruff.toml index 0afa0fe36b0ce4..bd46f98512d6e2 100644 --- a/Lib/test/.ruff.toml +++ b/Lib/test/.ruff.toml @@ -24,7 +24,6 @@ extend-exclude = [ "test_pkg.py", "test_subclassinit.py", "test_tokenize.py", - "test_typing.py", "test_yield_from.py", "time_hashlib.py", ] diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 149c29283d4bae..177155a6365662 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -183,7 +183,7 @@ def test_cannot_subclass(self): class A(self.bottom_type): pass with self.assertRaises(TypeError): - class A(type(self.bottom_type)): + class B(type(self.bottom_type)): pass def test_cannot_instantiate(self): @@ -279,7 +279,7 @@ def test_cannot_subclass(self): class C(type(Self)): pass with self.assertRaises(TypeError): - class C(Self): + class D(Self): pass def test_cannot_init(self): @@ -335,7 +335,7 @@ def test_cannot_subclass(self): class C(type(LiteralString)): pass with self.assertRaises(TypeError): - class C(LiteralString): + class D(LiteralString): pass def test_cannot_init(self): @@ -457,7 +457,7 @@ class V(TypeVar('T')): def test_cannot_subclass_var_itself(self): with self.assertRaises(TypeError): - class V(TypeVar): + class W(TypeVar): pass def test_cannot_instantiate_vars(self): @@ -1185,20 +1185,20 @@ class C(TypeVarTuple): pass def test_cannot_subclass_instance(self): Ts = TypeVarTuple('Ts') with self.assertRaises(TypeError): - class C(Ts): pass + class D(Ts): pass with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): - class C(type(Unpack)): pass + class E(type(Unpack)): pass with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): - class C(type(*Ts)): pass + class F(type(*Ts)): pass with self.assertRaisesRegex(TypeError, CANNOT_SUBCLASS_TYPE): - class C(type(Unpack[Ts])): pass + class G(type(Unpack[Ts])): pass with self.assertRaisesRegex(TypeError, r'Cannot subclass typing\.Unpack'): - class C(Unpack): pass + class H(Unpack): pass with self.assertRaisesRegex(TypeError, r'Cannot subclass \*Ts'): - class C(*Ts): pass + class I(*Ts): pass with self.assertRaisesRegex(TypeError, r'Cannot subclass \*Ts'): - class C(Unpack[Ts]): pass + class J(Unpack[Ts]): pass def test_variadic_class_args_are_correct(self): T = TypeVar('T') @@ -1372,12 +1372,12 @@ 
def test_variadic_class_with_duplicate_typevartuples_fails(self): with self.assertRaises(TypeError): class C(Generic[*Ts1, *Ts1]): pass with self.assertRaises(TypeError): - class C(Generic[Unpack[Ts1], Unpack[Ts1]]): pass + class D(Generic[Unpack[Ts1], Unpack[Ts1]]): pass with self.assertRaises(TypeError): - class C(Generic[*Ts1, *Ts2, *Ts1]): pass + class E(Generic[*Ts1, *Ts2, *Ts1]): pass with self.assertRaises(TypeError): - class C(Generic[Unpack[Ts1], Unpack[Ts2], Unpack[Ts1]]): pass + class F(Generic[Unpack[Ts1], Unpack[Ts2], Unpack[Ts1]]): pass def test_type_concatenation_in_variadic_class_argument_list_succeeds(self): Ts = TypeVarTuple('Ts') @@ -1744,10 +1744,10 @@ def test_cannot_subclass(self): class C(Union): pass with self.assertRaises(TypeError): - class C(type(Union)): + class D(type(Union)): pass with self.assertRaises(TypeError): - class C(Union[int, str]): + class E(Union[int, str]): pass def test_cannot_instantiate(self): @@ -2454,10 +2454,10 @@ class BP(Protocol): pass class P(C, Protocol): pass with self.assertRaises(TypeError): - class P(Protocol, C): + class Q(Protocol, C): pass with self.assertRaises(TypeError): - class P(BP, C, Protocol): + class R(BP, C, Protocol): pass class D(BP, C): pass @@ -2722,7 +2722,7 @@ class NotAProtocolButAnImplicitSubclass3: meth: Callable[[], None] meth2: Callable[[int, str], bool] def meth(self): pass - def meth(self, x, y): return True + def meth2(self, x, y): return True self.assertNotIsSubclass(AnnotatedButNotAProtocol, CallableMembersProto) self.assertIsSubclass(NotAProtocolButAnImplicitSubclass, CallableMembersProto) @@ -3208,11 +3208,11 @@ def test_protocols_bad_subscripts(self): with self.assertRaises(TypeError): class P(Protocol[T, T]): pass with self.assertRaises(TypeError): - class P(Protocol[int]): pass + class Q(Protocol[int]): pass with self.assertRaises(TypeError): - class P(Protocol[T], Protocol[S]): pass + class R(Protocol[T], Protocol[S]): pass with self.assertRaises(TypeError): - class P(typing.Mapping[T, S], Protocol[T]): pass + class S(typing.Mapping[T, S], Protocol[T]): pass def test_generic_protocols_repr(self): T = TypeVar('T') @@ -3553,12 +3553,12 @@ class NewGeneric(Generic): ... with self.assertRaises(TypeError): class MyGeneric(Generic[T], Generic[S]): ... with self.assertRaises(TypeError): - class MyGeneric(List[T], Generic[S]): ... + class MyGeneric2(List[T], Generic[S]): ... 
with self.assertRaises(TypeError): Generic[()] - class C(Generic[T]): pass + class D(Generic[T]): pass with self.assertRaises(TypeError): - C[()] + D[()] def test_init(self): T = TypeVar('T') @@ -4234,7 +4234,7 @@ class Test(Generic[T], Final): class Subclass(Test): pass with self.assertRaises(FinalException): - class Subclass(Test[int]): + class Subclass2(Test[int]): pass def test_nested(self): @@ -4472,7 +4472,7 @@ def test_cannot_subclass(self): class C(type(ClassVar)): pass with self.assertRaises(TypeError): - class C(type(ClassVar[int])): + class D(type(ClassVar[int])): pass def test_cannot_init(self): @@ -4514,7 +4514,7 @@ def test_cannot_subclass(self): class C(type(Final)): pass with self.assertRaises(TypeError): - class C(type(Final[int])): + class D(type(Final[int])): pass def test_cannot_init(self): @@ -6514,15 +6514,15 @@ class A: class X(NamedTuple, A): x: int with self.assertRaises(TypeError): - class X(NamedTuple, tuple): + class Y(NamedTuple, tuple): x: int with self.assertRaises(TypeError): - class X(NamedTuple, NamedTuple): + class Z(NamedTuple, NamedTuple): x: int - class A(NamedTuple): + class B(NamedTuple): x: int with self.assertRaises(TypeError): - class X(NamedTuple, A): + class C(NamedTuple, B): y: str def test_generic(self): @@ -7108,13 +7108,13 @@ def test_cannot_subclass(self): class C(type(Required)): pass with self.assertRaises(TypeError): - class C(type(Required[int])): + class D(type(Required[int])): pass with self.assertRaises(TypeError): - class C(Required): + class E(Required): pass with self.assertRaises(TypeError): - class C(Required[int]): + class F(Required[int]): pass def test_cannot_init(self): @@ -7154,13 +7154,13 @@ def test_cannot_subclass(self): class C(type(NotRequired)): pass with self.assertRaises(TypeError): - class C(type(NotRequired[int])): + class D(type(NotRequired[int])): pass with self.assertRaises(TypeError): - class C(NotRequired): + class E(NotRequired): pass with self.assertRaises(TypeError): - class C(NotRequired[int]): + class F(NotRequired[int]): pass def test_cannot_init(self): @@ -7622,7 +7622,7 @@ class C(TypeAlias): pass with self.assertRaises(TypeError): - class C(type(TypeAlias)): + class D(type(TypeAlias)): pass def test_repr(self): @@ -8078,7 +8078,7 @@ def test_cannot_subclass(self): class C(type(TypeGuard)): pass with self.assertRaises(TypeError): - class C(type(TypeGuard[int])): + class D(type(TypeGuard[int])): pass def test_cannot_init(self): From 6a6e8871ba0dcf4332093e198a876957651768ed Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 3 Oct 2023 09:02:04 -0700 Subject: [PATCH 007/265] [3.11] gh-109234: Hint to contextlib.closing in sqlite3 context manager docs (GH-109322) (#110294) (cherry picked from commit 4227bfa8b273207a2b882f7d69c8ac49c3d2b57d) Co-authored-by: Lincoln <71312724+Lincoln-developer@users.noreply.github.com> Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> Co-authored-by: Erlend E. Aasland --- Doc/library/sqlite3.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index f3bcfb5a28f9a3..c85f4fc49c070e 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -2195,9 +2195,9 @@ If there is no open transaction upon leaving the body of the ``with`` statement, the context manager is a no-op. .. note:: - The context manager neither implicitly opens a new transaction - nor closes the connection. + nor closes the connection. 
If you need a closing context manager, consider + using :meth:`contextlib.closing`. .. testcode:: From 4e44927a0e80e85f68f26dcba06702022bd9f884 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 3 Oct 2023 15:27:56 -0700 Subject: [PATCH 008/265] [3.11] gh-109917: Fix test instability in test_concurrent_futures (GH-110306) (#110316) gh-109917: Fix test instability in test_concurrent_futures (GH-110306) The test had an instability issue due to the ordering of the dummy queue operation and the real wakeup pipe operations. Both primitives are thread safe but not done atomically as a single update and may interleave arbitrarily. With the old order of operations this can lead to an incorrect state where the dummy queue is full but the wakeup pipe is empty. By swapping the order in clear() I think this can no longer happen in any possible operation interleaving (famous last words). (cherry picked from commit a376a72bd92cd7c9930467dd1aba40045fb75e3b) Co-authored-by: elfstrom --- Lib/test/test_concurrent_futures/test_deadlock.py | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/Lib/test/test_concurrent_futures/test_deadlock.py b/Lib/test/test_concurrent_futures/test_deadlock.py index 7ede95d5ed3422..1db4cd009926b9 100644 --- a/Lib/test/test_concurrent_futures/test_deadlock.py +++ b/Lib/test/test_concurrent_futures/test_deadlock.py @@ -284,11 +284,12 @@ def wakeup(self): super().wakeup() def clear(self): + super().clear() try: while True: self._dummy_queue.get_nowait() except queue.Empty: - super().clear() + pass with (unittest.mock.patch.object(futures.process._ExecutorManagerThread, 'run', mock_run), From 5039db7f9b2213634cd053a7f4a1e6000edf8e2f Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 4 Oct 2023 02:01:19 -0700 Subject: [PATCH 009/265] [3.11] gh-110267: Add tests for pickling and copying PyStructSequence objects (GH-110272) (GH-110284) (cherry picked from commit 2d4865d775123e8889c7a79fc49b4bf627176c4b) Co-authored-by: Xuehai Pan --- Lib/test/test_structseq.py | 75 ++++++++++++++++++- ...-10-03-10-54-09.gh-issue-110267.O-c47G.rst | 2 + 2 files changed, 75 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst diff --git a/Lib/test/test_structseq.py b/Lib/test/test_structseq.py index a9fe193028ebe4..c6c0afaf077acc 100644 --- a/Lib/test/test_structseq.py +++ b/Lib/test/test_structseq.py @@ -1,4 +1,6 @@ +import copy import os +import pickle import time import unittest @@ -106,9 +108,78 @@ def __len__(self): self.assertRaises(Exc, time.struct_time, C()) - def test_reduce(self): + def test_pickling(self): t = time.gmtime() - x = t.__reduce__() + for proto in range(pickle.HIGHEST_PROTOCOL + 1): + p = pickle.dumps(t, proto) + t2 = pickle.loads(p) + self.assertEqual(t2.__class__, t.__class__) + self.assertEqual(t2, t) + self.assertEqual(t2.tm_year, t.tm_year) + self.assertEqual(t2.tm_zone, t.tm_zone) + + def test_pickling_with_unnamed_fields(self): + assert os.stat_result.n_unnamed_fields > 0 + + r = os.stat_result(range(os.stat_result.n_sequence_fields), + {'st_atime': 1.0, 'st_atime_ns': 2.0}) + for proto in range(pickle.HIGHEST_PROTOCOL + 1): + p = pickle.dumps(r, proto) + r2 = pickle.loads(p) + self.assertEqual(r2.__class__, r.__class__) + self.assertEqual(r2, r) + self.assertEqual(r2.st_mode, r.st_mode) + self.assertEqual(r2.st_atime, r.st_atime) + self.assertEqual(r2.st_atime_ns, 
r.st_atime_ns) + + def test_copying(self): + n_fields = time.struct_time.n_fields + t = time.struct_time([[i] for i in range(n_fields)]) + + t2 = copy.copy(t) + self.assertEqual(t2.__class__, t.__class__) + self.assertEqual(t2, t) + self.assertEqual(t2.tm_year, t.tm_year) + self.assertEqual(t2.tm_zone, t.tm_zone) + self.assertIs(t2[0], t[0]) + self.assertIs(t2.tm_year, t.tm_year) + + t3 = copy.deepcopy(t) + self.assertEqual(t3.__class__, t.__class__) + self.assertEqual(t3, t) + self.assertEqual(t3.tm_year, t.tm_year) + self.assertEqual(t3.tm_zone, t.tm_zone) + self.assertIsNot(t3[0], t[0]) + self.assertIsNot(t3.tm_year, t.tm_year) + + def test_copying_with_unnamed_fields(self): + assert os.stat_result.n_unnamed_fields > 0 + + n_sequence_fields = os.stat_result.n_sequence_fields + r = os.stat_result([[i] for i in range(n_sequence_fields)], + {'st_atime': [1.0], 'st_atime_ns': [2.0]}) + + r2 = copy.copy(r) + self.assertEqual(r2.__class__, r.__class__) + self.assertEqual(r2, r) + self.assertEqual(r2.st_mode, r.st_mode) + self.assertEqual(r2.st_atime, r.st_atime) + self.assertEqual(r2.st_atime_ns, r.st_atime_ns) + self.assertIs(r2[0], r[0]) + self.assertIs(r2.st_mode, r.st_mode) + self.assertIs(r2.st_atime, r.st_atime) + self.assertIs(r2.st_atime_ns, r.st_atime_ns) + + r3 = copy.deepcopy(r) + self.assertEqual(r3.__class__, r.__class__) + self.assertEqual(r3, r) + self.assertEqual(r3.st_mode, r.st_mode) + self.assertEqual(r3.st_atime, r.st_atime) + self.assertEqual(r3.st_atime_ns, r.st_atime_ns) + self.assertIsNot(r3[0], r[0]) + self.assertIsNot(r3.st_mode, r.st_mode) + self.assertIsNot(r3.st_atime, r.st_atime) + self.assertIsNot(r3.st_atime_ns, r.st_atime_ns) def test_extended_getslice(self): # Test extended slicing by comparing with list slicing. diff --git a/Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst b/Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst new file mode 100644 index 00000000000000..2bae7715cc3d5b --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst @@ -0,0 +1,2 @@ +Add tests for pickling and copying PyStructSequence objects. +Patched by Xuehai Pan. From 6c98c734c7bcef2eee870bf4e5b4326bb81b320c Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Wed, 4 Oct 2023 12:53:28 +0200 Subject: [PATCH 010/265] =?UTF-8?q?[3.11]=20gh-109972:=20Split=20test=5Fgd?= =?UTF-8?q?b.py=20into=20test=5Fgdb=20package=20(#109977)=20(=E2=80=A6=20(?= =?UTF-8?q?#110343)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit [3.12] gh-109972: Split test_gdb.py into test_gdb package (#109977) (#110339) gh-109972: Split test_gdb.py into test_gdb package (#109977) Split test_gdb.py file into a test_gdb package made of multiple tests, so tests can now be run in parallel. * Create Lib/test/test_gdb/ directory. * Split test_gdb.py into multiple files in Lib/test/test_gdb/ directory. * Move Lib/test/gdb_sample.py to Lib/test/test_gdb/ directory. Update get_sample_script(): use __file__ to locate gdb_sample.py. * Move gdb_has_frame_select() and HAS_PYUP_PYDOWN to test_misc.py. * Explicitly skip test_gdb on Windows. Previously, test_gdb was skipped even if gdb was available because of gdb_has_frame_select(). 
(cherry picked from commit 8f324b7ecd2df3036fab098c4c8ac185ac07b277) (cherry picked from commit e7a61d34b7897ac6cff53add2ec03309a5ff8350) --- Lib/test/libregrtest/runtest.py | 1 + Lib/test/test_gdb.py | 1066 ----------------- Lib/test/test_gdb/__init__.py | 10 + Lib/test/{ => test_gdb}/gdb_sample.py | 2 +- Lib/test/test_gdb/test_backtrace.py | 134 +++ Lib/test/test_gdb/test_cfunction.py | 83 ++ Lib/test/test_gdb/test_misc.py | 188 +++ Lib/test/test_gdb/test_pretty_print.py | 400 +++++++ Lib/test/test_gdb/util.py | 304 +++++ Makefile.pre.in | 1 + ...-09-28-12-25-19.gh-issue-109972.GYnwIP.rst | 2 + 11 files changed, 1124 insertions(+), 1067 deletions(-) delete mode 100644 Lib/test/test_gdb.py create mode 100644 Lib/test/test_gdb/__init__.py rename Lib/test/{ => test_gdb}/gdb_sample.py (75%) create mode 100644 Lib/test/test_gdb/test_backtrace.py create mode 100644 Lib/test/test_gdb/test_cfunction.py create mode 100644 Lib/test/test_gdb/test_misc.py create mode 100644 Lib/test/test_gdb/test_pretty_print.py create mode 100644 Lib/test/test_gdb/util.py create mode 100644 Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst diff --git a/Lib/test/libregrtest/runtest.py b/Lib/test/libregrtest/runtest.py index 000a4aa75c44a0..f37093b81cf284 100644 --- a/Lib/test/libregrtest/runtest.py +++ b/Lib/test/libregrtest/runtest.py @@ -143,6 +143,7 @@ def set_env_changed(self): "test_asyncio", "test_concurrent_futures", "test_future_stmt", + "test_gdb", "test_multiprocessing_fork", "test_multiprocessing_forkserver", "test_multiprocessing_spawn", diff --git a/Lib/test/test_gdb.py b/Lib/test/test_gdb.py deleted file mode 100644 index 5d042a45587a58..00000000000000 --- a/Lib/test/test_gdb.py +++ /dev/null @@ -1,1066 +0,0 @@ -# Verify that gdb can pretty-print the various PyObject* types -# -# The code for testing gdb was adapted from similar work in Unladen Swallow's -# Lib/test/test_jit_gdb.py - -import os -import platform -import re -import subprocess -import sys -import sysconfig -import textwrap -import unittest - -from test import support -from test.support import findfile, python_is_optimized - -def get_gdb_version(): - try: - cmd = ["gdb", "-nx", "--version"] - proc = subprocess.Popen(cmd, - stdout=subprocess.PIPE, - stderr=subprocess.PIPE, - universal_newlines=True) - with proc: - version, stderr = proc.communicate() - - if proc.returncode: - raise Exception(f"Command {' '.join(cmd)!r} failed " - f"with exit code {proc.returncode}: " - f"stdout={version!r} stderr={stderr!r}") - except OSError: - # This is what "no gdb" looks like. There may, however, be other - # errors that manifest this way too. - raise unittest.SkipTest("Couldn't find gdb on the path") - - # Regex to parse: - # 'GNU gdb (GDB; SUSE Linux Enterprise 12) 7.7\n' -> 7.7 - # 'GNU gdb (GDB) Fedora 7.9.1-17.fc22\n' -> 7.9 - # 'GNU gdb 6.1.1 [FreeBSD]\n' -> 6.1 - # 'GNU gdb (GDB) Fedora (7.5.1-37.fc18)\n' -> 7.5 - # 'HP gdb 6.7 for HP Itanium (32 or 64 bit) and target HP-UX 11iv2 and 11iv3.\n' -> 6.7 - match = re.search(r"^(?:GNU|HP) gdb.*?\b(\d+)\.(\d+)", version) - if match is None: - raise Exception("unable to parse GDB version: %r" % version) - return (version, int(match.group(1)), int(match.group(2))) - -gdb_version, gdb_major_version, gdb_minor_version = get_gdb_version() -if gdb_major_version < 7: - raise unittest.SkipTest("gdb versions before 7.0 didn't support python " - "embedding. 
Saw %s.%s:\n%s" - % (gdb_major_version, gdb_minor_version, - gdb_version)) - -if not sysconfig.is_python_build(): - raise unittest.SkipTest("test_gdb only works on source builds at the moment.") - -if ((sysconfig.get_config_var('PGO_PROF_USE_FLAG') or 'xxx') in - (sysconfig.get_config_var('PY_CORE_CFLAGS') or '')): - raise unittest.SkipTest("test_gdb is not reliable on PGO builds") - -# Location of custom hooks file in a repository checkout. -checkout_hook_path = os.path.join(os.path.dirname(sys.executable), - 'python-gdb.py') - -PYTHONHASHSEED = '123' - - -def cet_protection(): - cflags = sysconfig.get_config_var('CFLAGS') - if not cflags: - return False - flags = cflags.split() - # True if "-mcet -fcf-protection" options are found, but false - # if "-fcf-protection=none" or "-fcf-protection=return" is found. - return (('-mcet' in flags) - and any((flag.startswith('-fcf-protection') - and not flag.endswith(("=none", "=return"))) - for flag in flags)) - -# Control-flow enforcement technology -CET_PROTECTION = cet_protection() - - -def run_gdb(*args, **env_vars): - """Runs gdb in --batch mode with the additional arguments given by *args. - - Returns its (stdout, stderr) decoded from utf-8 using the replace handler. - """ - if env_vars: - env = os.environ.copy() - env.update(env_vars) - else: - env = None - # -nx: Do not execute commands from any .gdbinit initialization files - # (issue #22188) - base_cmd = ('gdb', '--batch', '-nx') - if (gdb_major_version, gdb_minor_version) >= (7, 4): - base_cmd += ('-iex', 'add-auto-load-safe-path ' + checkout_hook_path) - proc = subprocess.Popen(base_cmd + args, - # Redirect stdin to prevent GDB from messing with - # the terminal settings - stdin=subprocess.PIPE, - stdout=subprocess.PIPE, - stderr=subprocess.PIPE, - env=env) - with proc: - out, err = proc.communicate() - return out.decode('utf-8', 'replace'), err.decode('utf-8', 'replace') - -# Verify that "gdb" was built with the embedded python support enabled: -gdbpy_version, _ = run_gdb("--eval-command=python import sys; print(sys.version_info)") -if not gdbpy_version: - raise unittest.SkipTest("gdb not built with embedded python support") - -if "major=2" in gdbpy_version: - raise unittest.SkipTest("gdb built with Python 2") - -# Verify that "gdb" can load our custom hooks, as OS security settings may -# disallow this without a customized .gdbinit. -_, gdbpy_errors = run_gdb('--args', sys.executable) -if "auto-loading has been declined" in gdbpy_errors: - msg = "gdb security settings prevent use of custom hooks: " - raise unittest.SkipTest(msg + gdbpy_errors.rstrip()) - -def gdb_has_frame_select(): - # Does this build of gdb have gdb.Frame.select ? - stdout, _ = run_gdb("--eval-command=python print(dir(gdb.Frame))") - m = re.match(r'.*\[(.*)\].*', stdout) - if not m: - raise unittest.SkipTest("Unable to parse output from gdb.Frame.select test") - gdb_frame_dir = m.group(1).split(', ') - return "'select'" in gdb_frame_dir - -HAS_PYUP_PYDOWN = gdb_has_frame_select() - -BREAKPOINT_FN='builtin_id' - -@unittest.skipIf(support.PGO, "not useful for PGO") -class DebuggerTests(unittest.TestCase): - - """Test that the debugger can debug Python.""" - - def get_stack_trace(self, source=None, script=None, - breakpoint=BREAKPOINT_FN, - cmds_after_breakpoint=None, - import_site=False, - ignore_stderr=False): - ''' - Run 'python -c SOURCE' under gdb with a breakpoint. 
- - Support injecting commands after the breakpoint is reached - - Returns the stdout from gdb - - cmds_after_breakpoint: if provided, a list of strings: gdb commands - ''' - # We use "set breakpoint pending yes" to avoid blocking with a: - # Function "foo" not defined. - # Make breakpoint pending on future shared library load? (y or [n]) - # error, which typically happens python is dynamically linked (the - # breakpoints of interest are to be found in the shared library) - # When this happens, we still get: - # Function "textiowrapper_write" not defined. - # emitted to stderr each time, alas. - - # Initially I had "--eval-command=continue" here, but removed it to - # avoid repeated print breakpoints when traversing hierarchical data - # structures - - # Generate a list of commands in gdb's language: - commands = ['set breakpoint pending yes', - 'break %s' % breakpoint, - - # The tests assume that the first frame of printed - # backtrace will not contain program counter, - # that is however not guaranteed by gdb - # therefore we need to use 'set print address off' to - # make sure the counter is not there. For example: - # #0 in PyObject_Print ... - # is assumed, but sometimes this can be e.g. - # #0 0x00003fffb7dd1798 in PyObject_Print ... - 'set print address off', - - 'run'] - - # GDB as of 7.4 onwards can distinguish between the - # value of a variable at entry vs current value: - # http://sourceware.org/gdb/onlinedocs/gdb/Variables.html - # which leads to the selftests failing with errors like this: - # AssertionError: 'v@entry=()' != '()' - # Disable this: - if (gdb_major_version, gdb_minor_version) >= (7, 4): - commands += ['set print entry-values no'] - - if cmds_after_breakpoint: - if CET_PROTECTION: - # bpo-32962: When Python is compiled with -mcet - # -fcf-protection, function arguments are unusable before - # running the first instruction of the function entry point. - # The 'next' command makes the required first step. - commands += ['next'] - commands += cmds_after_breakpoint - else: - commands += ['backtrace'] - - # print commands - - # Use "commands" to generate the arguments with which to invoke "gdb": - args = ['--eval-command=%s' % cmd for cmd in commands] - args += ["--args", - sys.executable] - args.extend(subprocess._args_from_interpreter_flags()) - - if not import_site: - # -S suppresses the default 'import site' - args += ["-S"] - - if source: - args += ["-c", source] - elif script: - args += [script] - - # Use "args" to invoke gdb, capturing stdout, stderr: - out, err = run_gdb(*args, PYTHONHASHSEED=PYTHONHASHSEED) - - if not ignore_stderr: - for line in err.splitlines(): - print(line, file=sys.stderr) - - # bpo-34007: Sometimes some versions of the shared libraries that - # are part of the traceback are compiled in optimised mode and the - # Program Counter (PC) is not present, not allowing gdb to walk the - # frames back. When this happens, the Python bindings of gdb raise - # an exception, making the test impossible to succeed. - if "PC not saved" in err: - raise unittest.SkipTest("gdb cannot walk the frame object" - " because the Program Counter is" - " not present") - - # bpo-40019: Skip the test if gdb failed to read debug information - # because the Python binary is optimized. 
- for pattern in ( - '(frame information optimized out)', - 'Unable to read information on python frame', - # gh-91960: On Python built with "clang -Og", gdb gets - # "frame=" for _PyEval_EvalFrameDefault() parameter - '(unable to read python frame information)', - # gh-104736: On Python built with "clang -Og" on ppc64le, - # "py-bt" displays a truncated or not traceback, but "where" - # logs this error message: - 'Backtrace stopped: frame did not save the PC', - # gh-104736: When "bt" command displays something like: - # "#1 0x0000000000000000 in ?? ()", the traceback is likely - # truncated or wrong. - ' ?? ()', - ): - if pattern in out: - raise unittest.SkipTest(f"{pattern!r} found in gdb output") - - return out - - def get_gdb_repr(self, source, - cmds_after_breakpoint=None, - import_site=False): - # Given an input python source representation of data, - # run "python -c'id(DATA)'" under gdb with a breakpoint on - # builtin_id and scrape out gdb's representation of the "op" - # parameter, and verify that the gdb displays the same string - # - # Verify that the gdb displays the expected string - # - # For a nested structure, the first time we hit the breakpoint will - # give us the top-level structure - - # NOTE: avoid decoding too much of the traceback as some - # undecodable characters may lurk there in optimized mode - # (issue #19743). - cmds_after_breakpoint = cmds_after_breakpoint or ["backtrace 1"] - gdb_output = self.get_stack_trace(source, breakpoint=BREAKPOINT_FN, - cmds_after_breakpoint=cmds_after_breakpoint, - import_site=import_site) - # gdb can insert additional '\n' and space characters in various places - # in its output, depending on the width of the terminal it's connected - # to (using its "wrap_here" function) - m = re.search( - # Match '#0 builtin_id(self=..., v=...)' - r'#0\s+builtin_id\s+\(self\=.*,\s+v=\s*(.*?)?\)' - # Match ' at Python/bltinmodule.c'. - # bpo-38239: builtin_id() is defined in Python/bltinmodule.c, - # but accept any "Directory\file.c" to support Link Time - # Optimization (LTO). 
- r'\s+at\s+\S*[A-Za-z]+/[A-Za-z0-9_-]+\.c', - gdb_output, re.DOTALL) - if not m: - self.fail('Unexpected gdb output: %r\n%s' % (gdb_output, gdb_output)) - return m.group(1), gdb_output - - def assertEndsWith(self, actual, exp_end): - '''Ensure that the given "actual" string ends with "exp_end"''' - self.assertTrue(actual.endswith(exp_end), - msg='%r did not end with %r' % (actual, exp_end)) - - def assertMultilineMatches(self, actual, pattern): - m = re.match(pattern, actual, re.DOTALL) - if not m: - self.fail(msg='%r did not match %r' % (actual, pattern)) - - def get_sample_script(self): - return findfile('gdb_sample.py') - -class PrettyPrintTests(DebuggerTests): - def test_getting_backtrace(self): - gdb_output = self.get_stack_trace('id(42)') - self.assertTrue(BREAKPOINT_FN in gdb_output) - - def assertGdbRepr(self, val, exp_repr=None): - # Ensure that gdb's rendering of the value in a debugged process - # matches repr(value) in this process: - gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')') - if not exp_repr: - exp_repr = repr(val) - self.assertEqual(gdb_repr, exp_repr, - ('%r did not equal expected %r; full output was:\n%s' - % (gdb_repr, exp_repr, gdb_output))) - - @support.requires_resource('cpu') - def test_int(self): - 'Verify the pretty-printing of various int values' - self.assertGdbRepr(42) - self.assertGdbRepr(0) - self.assertGdbRepr(-7) - self.assertGdbRepr(1000000000000) - self.assertGdbRepr(-1000000000000000) - - def test_singletons(self): - 'Verify the pretty-printing of True, False and None' - self.assertGdbRepr(True) - self.assertGdbRepr(False) - self.assertGdbRepr(None) - - def test_dicts(self): - 'Verify the pretty-printing of dictionaries' - self.assertGdbRepr({}) - self.assertGdbRepr({'foo': 'bar'}, "{'foo': 'bar'}") - # Python preserves insertion order since 3.6 - self.assertGdbRepr({'foo': 'bar', 'douglas': 42}, "{'foo': 'bar', 'douglas': 42}") - - def test_lists(self): - 'Verify the pretty-printing of lists' - self.assertGdbRepr([]) - self.assertGdbRepr(list(range(5))) - - @support.requires_resource('cpu') - def test_bytes(self): - 'Verify the pretty-printing of bytes' - self.assertGdbRepr(b'') - self.assertGdbRepr(b'And now for something hopefully the same') - self.assertGdbRepr(b'string with embedded NUL here \0 and then some more text') - self.assertGdbRepr(b'this is a tab:\t' - b' this is a slash-N:\n' - b' this is a slash-R:\r' - ) - - self.assertGdbRepr(b'this is byte 255:\xff and byte 128:\x80') - - self.assertGdbRepr(bytes([b for b in range(255)])) - - @support.requires_resource('cpu') - def test_strings(self): - 'Verify the pretty-printing of unicode strings' - # We cannot simply call locale.getpreferredencoding() here, - # as GDB might have been linked against a different version - # of Python with a different encoding and coercion policy - # with respect to PEP 538 and PEP 540. 
- out, err = run_gdb( - '--eval-command', - 'python import locale; print(locale.getpreferredencoding())') - - encoding = out.rstrip() - if err or not encoding: - raise RuntimeError( - f'unable to determine the preferred encoding ' - f'of embedded Python in GDB: {err}') - - def check_repr(text): - try: - text.encode(encoding) - except UnicodeEncodeError: - self.assertGdbRepr(text, ascii(text)) - else: - self.assertGdbRepr(text) - - self.assertGdbRepr('') - self.assertGdbRepr('And now for something hopefully the same') - self.assertGdbRepr('string with embedded NUL here \0 and then some more text') - - # Test printing a single character: - # U+2620 SKULL AND CROSSBONES - check_repr('\u2620') - - # Test printing a Japanese unicode string - # (I believe this reads "mojibake", using 3 characters from the CJK - # Unified Ideographs area, followed by U+3051 HIRAGANA LETTER KE) - check_repr('\u6587\u5b57\u5316\u3051') - - # Test a character outside the BMP: - # U+1D121 MUSICAL SYMBOL C CLEF - # This is: - # UTF-8: 0xF0 0x9D 0x84 0xA1 - # UTF-16: 0xD834 0xDD21 - check_repr(chr(0x1D121)) - - def test_tuples(self): - 'Verify the pretty-printing of tuples' - self.assertGdbRepr(tuple(), '()') - self.assertGdbRepr((1,), '(1,)') - self.assertGdbRepr(('foo', 'bar', 'baz')) - - @support.requires_resource('cpu') - def test_sets(self): - 'Verify the pretty-printing of sets' - if (gdb_major_version, gdb_minor_version) < (7, 3): - self.skipTest("pretty-printing of sets needs gdb 7.3 or later") - self.assertGdbRepr(set(), "set()") - self.assertGdbRepr(set(['a']), "{'a'}") - # PYTHONHASHSEED is need to get the exact frozenset item order - if not sys.flags.ignore_environment: - self.assertGdbRepr(set(['a', 'b']), "{'a', 'b'}") - self.assertGdbRepr(set([4, 5, 6]), "{4, 5, 6}") - - # Ensure that we handle sets containing the "dummy" key value, - # which happens on deletion: - gdb_repr, gdb_output = self.get_gdb_repr('''s = set(['a','b']) -s.remove('a') -id(s)''') - self.assertEqual(gdb_repr, "{'b'}") - - @support.requires_resource('cpu') - def test_frozensets(self): - 'Verify the pretty-printing of frozensets' - if (gdb_major_version, gdb_minor_version) < (7, 3): - self.skipTest("pretty-printing of frozensets needs gdb 7.3 or later") - self.assertGdbRepr(frozenset(), "frozenset()") - self.assertGdbRepr(frozenset(['a']), "frozenset({'a'})") - # PYTHONHASHSEED is need to get the exact frozenset item order - if not sys.flags.ignore_environment: - self.assertGdbRepr(frozenset(['a', 'b']), "frozenset({'a', 'b'})") - self.assertGdbRepr(frozenset([4, 5, 6]), "frozenset({4, 5, 6})") - - def test_exceptions(self): - # Test a RuntimeError - gdb_repr, gdb_output = self.get_gdb_repr(''' -try: - raise RuntimeError("I am an error") -except RuntimeError as e: - id(e) -''') - self.assertEqual(gdb_repr, - "RuntimeError('I am an error',)") - - - # Test division by zero: - gdb_repr, gdb_output = self.get_gdb_repr(''' -try: - a = 1 / 0 -except ZeroDivisionError as e: - id(e) -''') - self.assertEqual(gdb_repr, - "ZeroDivisionError('division by zero',)") - - def test_modern_class(self): - 'Verify the pretty-printing of new-style class instances' - gdb_repr, gdb_output = self.get_gdb_repr(''' -class Foo: - pass -foo = Foo() -foo.an_int = 42 -id(foo)''') - m = re.match(r'', gdb_repr) - self.assertTrue(m, - msg='Unexpected new-style class rendering %r' % gdb_repr) - - def test_subclassing_list(self): - 'Verify the pretty-printing of an instance of a list subclass' - gdb_repr, gdb_output = self.get_gdb_repr(''' -class Foo(list): - pass -foo 
= Foo() -foo += [1, 2, 3] -foo.an_int = 42 -id(foo)''') - m = re.match(r'', gdb_repr) - - self.assertTrue(m, - msg='Unexpected new-style class rendering %r' % gdb_repr) - - def test_subclassing_tuple(self): - 'Verify the pretty-printing of an instance of a tuple subclass' - # This should exercise the negative tp_dictoffset code in the - # new-style class support - gdb_repr, gdb_output = self.get_gdb_repr(''' -class Foo(tuple): - pass -foo = Foo((1, 2, 3)) -foo.an_int = 42 -id(foo)''') - m = re.match(r'', gdb_repr) - - self.assertTrue(m, - msg='Unexpected new-style class rendering %r' % gdb_repr) - - def assertSane(self, source, corruption, exprepr=None): - '''Run Python under gdb, corrupting variables in the inferior process - immediately before taking a backtrace. - - Verify that the variable's representation is the expected failsafe - representation''' - if corruption: - cmds_after_breakpoint=[corruption, 'backtrace'] - else: - cmds_after_breakpoint=['backtrace'] - - gdb_repr, gdb_output = \ - self.get_gdb_repr(source, - cmds_after_breakpoint=cmds_after_breakpoint) - if exprepr: - if gdb_repr == exprepr: - # gdb managed to print the value in spite of the corruption; - # this is good (see http://bugs.python.org/issue8330) - return - - # Match anything for the type name; 0xDEADBEEF could point to - # something arbitrary (see http://bugs.python.org/issue8330) - pattern = '<.* at remote 0x-?[0-9a-f]+>' - - m = re.match(pattern, gdb_repr) - if not m: - self.fail('Unexpected gdb representation: %r\n%s' % \ - (gdb_repr, gdb_output)) - - def test_NULL_ptr(self): - 'Ensure that a NULL PyObject* is handled gracefully' - gdb_repr, gdb_output = ( - self.get_gdb_repr('id(42)', - cmds_after_breakpoint=['set variable v=0', - 'backtrace']) - ) - - self.assertEqual(gdb_repr, '0x0') - - def test_NULL_ob_type(self): - 'Ensure that a PyObject* with NULL ob_type is handled gracefully' - self.assertSane('id(42)', - 'set v->ob_type=0') - - def test_corrupt_ob_type(self): - 'Ensure that a PyObject* with a corrupt ob_type is handled gracefully' - self.assertSane('id(42)', - 'set v->ob_type=0xDEADBEEF', - exprepr='42') - - def test_corrupt_tp_flags(self): - 'Ensure that a PyObject* with a type with corrupt tp_flags is handled' - self.assertSane('id(42)', - 'set v->ob_type->tp_flags=0x0', - exprepr='42') - - def test_corrupt_tp_name(self): - 'Ensure that a PyObject* with a type with corrupt tp_name is handled' - self.assertSane('id(42)', - 'set v->ob_type->tp_name=0xDEADBEEF', - exprepr='42') - - def test_builtins_help(self): - 'Ensure that the new-style class _Helper in site.py can be handled' - - if sys.flags.no_site: - self.skipTest("need site module, but -S option was used") - - # (this was the issue causing tracebacks in - # http://bugs.python.org/issue8032#msg100537 ) - gdb_repr, gdb_output = self.get_gdb_repr('id(__builtins__.help)', import_site=True) - - m = re.match(r'<_Helper\(\) at remote 0x-?[0-9a-f]+>', gdb_repr) - self.assertTrue(m, - msg='Unexpected rendering %r' % gdb_repr) - - def test_selfreferential_list(self): - '''Ensure that a reference loop involving a list doesn't lead proxyval - into an infinite loop:''' - gdb_repr, gdb_output = \ - self.get_gdb_repr("a = [3, 4, 5] ; a.append(a) ; id(a)") - self.assertEqual(gdb_repr, '[3, 4, 5, [...]]') - - gdb_repr, gdb_output = \ - self.get_gdb_repr("a = [3, 4, 5] ; b = [a] ; a.append(b) ; id(a)") - self.assertEqual(gdb_repr, '[3, 4, 5, [[...]]]') - - def test_selfreferential_dict(self): - '''Ensure that a reference loop involving a dict doesn't lead 
proxyval - into an infinite loop:''' - gdb_repr, gdb_output = \ - self.get_gdb_repr("a = {} ; b = {'bar':a} ; a['foo'] = b ; id(a)") - - self.assertEqual(gdb_repr, "{'foo': {'bar': {...}}}") - - def test_selfreferential_old_style_instance(self): - gdb_repr, gdb_output = \ - self.get_gdb_repr(''' -class Foo: - pass -foo = Foo() -foo.an_attr = foo -id(foo)''') - self.assertTrue(re.match(r'\) at remote 0x-?[0-9a-f]+>', - gdb_repr), - 'Unexpected gdb representation: %r\n%s' % \ - (gdb_repr, gdb_output)) - - def test_selfreferential_new_style_instance(self): - gdb_repr, gdb_output = \ - self.get_gdb_repr(''' -class Foo(object): - pass -foo = Foo() -foo.an_attr = foo -id(foo)''') - self.assertTrue(re.match(r'\) at remote 0x-?[0-9a-f]+>', - gdb_repr), - 'Unexpected gdb representation: %r\n%s' % \ - (gdb_repr, gdb_output)) - - gdb_repr, gdb_output = \ - self.get_gdb_repr(''' -class Foo(object): - pass -a = Foo() -b = Foo() -a.an_attr = b -b.an_attr = a -id(a)''') - self.assertTrue(re.match(r'\) at remote 0x-?[0-9a-f]+>\) at remote 0x-?[0-9a-f]+>', - gdb_repr), - 'Unexpected gdb representation: %r\n%s' % \ - (gdb_repr, gdb_output)) - - def test_truncation(self): - 'Verify that very long output is truncated' - gdb_repr, gdb_output = self.get_gdb_repr('id(list(range(1000)))') - self.assertEqual(gdb_repr, - "[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, " - "14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, " - "27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, " - "40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, " - "53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, " - "66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, " - "79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, " - "92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, " - "104, 105, 106, 107, 108, 109, 110, 111, 112, 113, " - "114, 115, 116, 117, 118, 119, 120, 121, 122, 123, " - "124, 125, 126, 127, 128, 129, 130, 131, 132, 133, " - "134, 135, 136, 137, 138, 139, 140, 141, 142, 143, " - "144, 145, 146, 147, 148, 149, 150, 151, 152, 153, " - "154, 155, 156, 157, 158, 159, 160, 161, 162, 163, " - "164, 165, 166, 167, 168, 169, 170, 171, 172, 173, " - "174, 175, 176, 177, 178, 179, 180, 181, 182, 183, " - "184, 185, 186, 187, 188, 189, 190, 191, 192, 193, " - "194, 195, 196, 197, 198, 199, 200, 201, 202, 203, " - "204, 205, 206, 207, 208, 209, 210, 211, 212, 213, " - "214, 215, 216, 217, 218, 219, 220, 221, 222, 223, " - "224, 225, 226...(truncated)") - self.assertEqual(len(gdb_repr), - 1024 + len('...(truncated)')) - - def test_builtin_method(self): - gdb_repr, gdb_output = self.get_gdb_repr('import sys; id(sys.stdout.readlines)') - self.assertTrue(re.match(r'', - gdb_repr), - 'Unexpected gdb representation: %r\n%s' % \ - (gdb_repr, gdb_output)) - - def test_frames(self): - gdb_output = self.get_stack_trace(''' -import sys -def foo(a, b, c): - return sys._getframe(0) - -f = foo(3, 4, 5) -id(f)''', - breakpoint='builtin_id', - cmds_after_breakpoint=['print (PyFrameObject*)v'] - ) - self.assertTrue(re.match(r'.*\s+\$1 =\s+Frame 0x-?[0-9a-f]+, for file , line 4, in foo \(a=3.*', - gdb_output, - re.DOTALL), - 'Unexpected gdb representation: %r\n%s' % (gdb_output, gdb_output)) - -@unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") -class PyListTests(DebuggerTests): - def assertListing(self, expected, actual): - self.assertEndsWith(actual, expected) - - def test_basic_command(self): - 'Verify that the "py-list" command works' - bt = self.get_stack_trace(script=self.get_sample_script(), - 
cmds_after_breakpoint=['py-list']) - - self.assertListing(' 5 \n' - ' 6 def bar(a, b, c):\n' - ' 7 baz(a, b, c)\n' - ' 8 \n' - ' 9 def baz(*args):\n' - ' >10 id(42)\n' - ' 11 \n' - ' 12 foo(1, 2, 3)\n', - bt) - - def test_one_abs_arg(self): - 'Verify the "py-list" command with one absolute argument' - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-list 9']) - - self.assertListing(' 9 def baz(*args):\n' - ' >10 id(42)\n' - ' 11 \n' - ' 12 foo(1, 2, 3)\n', - bt) - - def test_two_abs_args(self): - 'Verify the "py-list" command with two absolute arguments' - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-list 1,3']) - - self.assertListing(' 1 # Sample script for use by test_gdb.py\n' - ' 2 \n' - ' 3 def foo(a, b, c):\n', - bt) - -SAMPLE_WITH_C_CALL = """ - -from _testcapi import pyobject_fastcall - -def foo(a, b, c): - bar(a, b, c) - -def bar(a, b, c): - pyobject_fastcall(baz, (a, b, c)) - -def baz(*args): - id(42) - -foo(1, 2, 3) - -""" - - -class StackNavigationTests(DebuggerTests): - @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - def test_pyup_command(self): - 'Verify that the "py-up" command works' - bt = self.get_stack_trace(source=SAMPLE_WITH_C_CALL, - cmds_after_breakpoint=['py-up', 'py-up']) - self.assertMultilineMatches(bt, - r'''^.* -#[0-9]+ Frame 0x-?[0-9a-f]+, for file , line 12, in baz \(args=\(1, 2, 3\)\) -#[0-9]+ -$''') - - @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") - def test_down_at_bottom(self): - 'Verify handling of "py-down" at the bottom of the stack' - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-down']) - self.assertEndsWith(bt, - 'Unable to find a newer python frame\n') - - @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") - def test_up_at_top(self): - 'Verify handling of "py-up" at the top of the stack' - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-up'] * 5) - self.assertEndsWith(bt, - 'Unable to find an older python frame\n') - - @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - def test_up_then_down(self): - 'Verify "py-up" followed by "py-down"' - bt = self.get_stack_trace(source=SAMPLE_WITH_C_CALL, - cmds_after_breakpoint=['py-up', 'py-up', 'py-down']) - self.assertMultilineMatches(bt, - r'''^.* -#[0-9]+ Frame 0x-?[0-9a-f]+, for file , line 12, in baz \(args=\(1, 2, 3\)\) -#[0-9]+ -#[0-9]+ Frame 0x-?[0-9a-f]+, for file , line 12, in baz \(args=\(1, 2, 3\)\) -$''') - -class PyBtTests(DebuggerTests): - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - def test_bt(self): - 'Verify that the "py-bt" command works' - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-bt']) - self.assertMultilineMatches(bt, - r'''^.* -Traceback \(most recent call first\): - - File ".*gdb_sample.py", line 10, in baz - id\(42\) - File ".*gdb_sample.py", line 7, in bar - baz\(a, b, c\) - File ".*gdb_sample.py", line 4, in foo - bar\(a=a, b=b, c=c\) - File ".*gdb_sample.py", line 12, in - foo\(1, 2, 3\) -''') - - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - def test_bt_full(self): - 'Verify that the "py-bt-full" command 
works' - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-bt-full']) - self.assertMultilineMatches(bt, - r'''^.* -#[0-9]+ Frame 0x-?[0-9a-f]+, for file .*gdb_sample.py, line 7, in bar \(a=1, b=2, c=3\) - baz\(a, b, c\) -#[0-9]+ Frame 0x-?[0-9a-f]+, for file .*gdb_sample.py, line 4, in foo \(a=1, b=2, c=3\) - bar\(a=a, b=b, c=c\) -#[0-9]+ Frame 0x-?[0-9a-f]+, for file .*gdb_sample.py, line 12, in \(\) - foo\(1, 2, 3\) -''') - - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - @support.requires_resource('cpu') - def test_threads(self): - 'Verify that "py-bt" indicates threads that are waiting for the GIL' - cmd = ''' -from threading import Thread - -class TestThread(Thread): - # These threads would run forever, but we'll interrupt things with the - # debugger - def run(self): - i = 0 - while 1: - i += 1 - -t = {} -for i in range(4): - t[i] = TestThread() - t[i].start() - -# Trigger a breakpoint on the main thread -id(42) - -''' - # Verify with "py-bt": - gdb_output = self.get_stack_trace(cmd, - cmds_after_breakpoint=['thread apply all py-bt']) - self.assertIn('Waiting for the GIL', gdb_output) - - # Verify with "py-bt-full": - gdb_output = self.get_stack_trace(cmd, - cmds_after_breakpoint=['thread apply all py-bt-full']) - self.assertIn('Waiting for the GIL', gdb_output) - - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - # Some older versions of gdb will fail with - # "Cannot find new threads: generic error" - # unless we add LD_PRELOAD=PATH-TO-libpthread.so.1 as a workaround - def test_gc(self): - 'Verify that "py-bt" indicates if a thread is garbage-collecting' - cmd = ('from gc import collect\n' - 'id(42)\n' - 'def foo():\n' - ' collect()\n' - 'def bar():\n' - ' foo()\n' - 'bar()\n') - # Verify with "py-bt": - gdb_output = self.get_stack_trace(cmd, - cmds_after_breakpoint=['break update_refs', 'continue', 'py-bt'], - ) - self.assertIn('Garbage-collecting', gdb_output) - - # Verify with "py-bt-full": - gdb_output = self.get_stack_trace(cmd, - cmds_after_breakpoint=['break update_refs', 'continue', 'py-bt-full'], - ) - self.assertIn('Garbage-collecting', gdb_output) - - - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - @support.requires_resource('cpu') - # Some older versions of gdb will fail with - # "Cannot find new threads: generic error" - # unless we add LD_PRELOAD=PATH-TO-libpthread.so.1 as a workaround - # - # gdb will also generate many erroneous errors such as: - # Function "meth_varargs" not defined. - # This is because we are calling functions from an "external" module - # (_testcapimodule) rather than compiled-in functions. It seems difficult - # to suppress these. See also the comment in DebuggerTests.get_stack_trace - def test_pycfunction(self): - 'Verify that "py-bt" displays invocations of PyCFunction instances' - # bpo-46600: If the compiler inlines _null_to_none() in meth_varargs() - # (ex: clang -Og), _null_to_none() is the frame #1. Otherwise, - # meth_varargs() is the frame #1. - expected_frame = r'#(1|2)' - # Various optimizations multiply the code paths by which these are - # called, so test a variety of calling conventions. 
- for func_name, args in ( - ('meth_varargs', ''), - ('meth_varargs_keywords', ''), - ('meth_o', '[]'), - ('meth_noargs', ''), - ('meth_fastcall', ''), - ('meth_fastcall_keywords', ''), - ): - for obj in ( - '_testcapi', - '_testcapi.MethClass', - '_testcapi.MethClass()', - '_testcapi.MethStatic()', - - # XXX: bound methods don't yet give nice tracebacks - # '_testcapi.MethInstance()', - ): - with self.subTest(f'{obj}.{func_name}'): - cmd = textwrap.dedent(f''' - import _testcapi - def foo(): - {obj}.{func_name}({args}) - def bar(): - foo() - bar() - ''') - # Verify with "py-bt": - gdb_output = self.get_stack_trace( - cmd, - breakpoint=func_name, - cmds_after_breakpoint=['bt', 'py-bt'], - # bpo-45207: Ignore 'Function "meth_varargs" not - # defined.' message in stderr. - ignore_stderr=True, - ) - self.assertIn(f'\n.*") - -class PyLocalsTests(DebuggerTests): - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - def test_basic_command(self): - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-up', 'py-locals']) - self.assertMultilineMatches(bt, - r".*\nargs = \(1, 2, 3\)\n.*") - - @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") - @unittest.skipIf(python_is_optimized(), - "Python was compiled with optimizations") - def test_locals_after_up(self): - bt = self.get_stack_trace(script=self.get_sample_script(), - cmds_after_breakpoint=['py-up', 'py-up', 'py-locals']) - self.assertMultilineMatches(bt, - r'''^.* -Locals for foo -a = 1 -b = 2 -c = 3 -Locals for -.*$''') - - -def setUpModule(): - if support.verbose: - print("GDB version %s.%s:" % (gdb_major_version, gdb_minor_version)) - for line in gdb_version.splitlines(): - print(" " * 4 + line) - - -if __name__ == "__main__": - unittest.main() diff --git a/Lib/test/test_gdb/__init__.py b/Lib/test/test_gdb/__init__.py new file mode 100644 index 00000000000000..0261f59adf54bd --- /dev/null +++ b/Lib/test/test_gdb/__init__.py @@ -0,0 +1,10 @@ +# Verify that gdb can pretty-print the various PyObject* types +# +# The code for testing gdb was adapted from similar work in Unladen Swallow's +# Lib/test/test_jit_gdb.py + +import os +from test.support import load_package_tests + +def load_tests(*args): + return load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/test/gdb_sample.py b/Lib/test/test_gdb/gdb_sample.py similarity index 75% rename from Lib/test/gdb_sample.py rename to Lib/test/test_gdb/gdb_sample.py index 4188f50136fb97..a7f23db73ea6e6 100644 --- a/Lib/test/gdb_sample.py +++ b/Lib/test/test_gdb/gdb_sample.py @@ -1,4 +1,4 @@ -# Sample script for use by test_gdb.py +# Sample script for use by test_gdb def foo(a, b, c): bar(a=a, b=b, c=c) diff --git a/Lib/test/test_gdb/test_backtrace.py b/Lib/test/test_gdb/test_backtrace.py new file mode 100644 index 00000000000000..15cbcf169ab9e3 --- /dev/null +++ b/Lib/test/test_gdb/test_backtrace.py @@ -0,0 +1,134 @@ +import textwrap +import unittest +from test import support +from test.support import python_is_optimized + +from .util import setup_module, DebuggerTests, CET_PROTECTION + + +def setUpModule(): + setup_module() + + +class PyBtTests(DebuggerTests): + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_bt(self): + 'Verify that the "py-bt" command works' + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-bt']) + self.assertMultilineMatches(bt, + r'''^.* +Traceback \(most recent call first\): + + File 
".*gdb_sample.py", line 10, in baz + id\(42\) + File ".*gdb_sample.py", line 7, in bar + baz\(a, b, c\) + File ".*gdb_sample.py", line 4, in foo + bar\(a=a, b=b, c=c\) + File ".*gdb_sample.py", line 12, in + foo\(1, 2, 3\) +''') + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_bt_full(self): + 'Verify that the "py-bt-full" command works' + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-bt-full']) + self.assertMultilineMatches(bt, + r'''^.* +#[0-9]+ Frame 0x-?[0-9a-f]+, for file .*gdb_sample.py, line 7, in bar \(a=1, b=2, c=3\) + baz\(a, b, c\) +#[0-9]+ Frame 0x-?[0-9a-f]+, for file .*gdb_sample.py, line 4, in foo \(a=1, b=2, c=3\) + bar\(a=a, b=b, c=c\) +#[0-9]+ Frame 0x-?[0-9a-f]+, for file .*gdb_sample.py, line 12, in \(\) + foo\(1, 2, 3\) +''') + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + @support.requires_resource('cpu') + def test_threads(self): + 'Verify that "py-bt" indicates threads that are waiting for the GIL' + cmd = ''' +from threading import Thread + +class TestThread(Thread): + # These threads would run forever, but we'll interrupt things with the + # debugger + def run(self): + i = 0 + while 1: + i += 1 + +t = {} +for i in range(4): + t[i] = TestThread() + t[i].start() + +# Trigger a breakpoint on the main thread +id(42) + +''' + # Verify with "py-bt": + gdb_output = self.get_stack_trace(cmd, + cmds_after_breakpoint=['thread apply all py-bt']) + self.assertIn('Waiting for the GIL', gdb_output) + + # Verify with "py-bt-full": + gdb_output = self.get_stack_trace(cmd, + cmds_after_breakpoint=['thread apply all py-bt-full']) + self.assertIn('Waiting for the GIL', gdb_output) + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + # Some older versions of gdb will fail with + # "Cannot find new threads: generic error" + # unless we add LD_PRELOAD=PATH-TO-libpthread.so.1 as a workaround + def test_gc(self): + 'Verify that "py-bt" indicates if a thread is garbage-collecting' + cmd = ('from gc import collect\n' + 'id(42)\n' + 'def foo():\n' + ' collect()\n' + 'def bar():\n' + ' foo()\n' + 'bar()\n') + # Verify with "py-bt": + gdb_output = self.get_stack_trace(cmd, + cmds_after_breakpoint=['break update_refs', 'continue', 'py-bt'], + ) + self.assertIn('Garbage-collecting', gdb_output) + + # Verify with "py-bt-full": + gdb_output = self.get_stack_trace(cmd, + cmds_after_breakpoint=['break update_refs', 'continue', 'py-bt-full'], + ) + self.assertIn('Garbage-collecting', gdb_output) + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_wrapper_call(self): + cmd = textwrap.dedent(''' + class MyList(list): + def __init__(self): + super(*[]).__init__() # wrapper_call() + + id("first break point") + l = MyList() + ''') + cmds_after_breakpoint = ['break wrapper_call', 'continue'] + if CET_PROTECTION: + # bpo-32962: same case as in get_stack_trace(): + # we need an additional 'next' command in order to read + # arguments of the innermost function of the call stack. 
+ cmds_after_breakpoint.append('next') + cmds_after_breakpoint.append('py-bt') + + # Verify with "py-bt": + gdb_output = self.get_stack_trace(cmd, + cmds_after_breakpoint=cmds_after_breakpoint) + self.assertRegex(gdb_output, + r"10 id(42)\n' + ' 11 \n' + ' 12 foo(1, 2, 3)\n', + bt) + + def test_one_abs_arg(self): + 'Verify the "py-list" command with one absolute argument' + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-list 9']) + + self.assertListing(' 9 def baz(*args):\n' + ' >10 id(42)\n' + ' 11 \n' + ' 12 foo(1, 2, 3)\n', + bt) + + def test_two_abs_args(self): + 'Verify the "py-list" command with two absolute arguments' + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-list 1,3']) + + self.assertListing(' 1 # Sample script for use by test_gdb\n' + ' 2 \n' + ' 3 def foo(a, b, c):\n', + bt) + +SAMPLE_WITH_C_CALL = """ + +from _testcapi import pyobject_vectorcall + +def foo(a, b, c): + bar(a, b, c) + +def bar(a, b, c): + pyobject_vectorcall(baz, (a, b, c), None) + +def baz(*args): + id(42) + +foo(1, 2, 3) + +""" + + +class StackNavigationTests(DebuggerTests): + @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_pyup_command(self): + 'Verify that the "py-up" command works' + bt = self.get_stack_trace(source=SAMPLE_WITH_C_CALL, + cmds_after_breakpoint=['py-up', 'py-up']) + self.assertMultilineMatches(bt, + r'''^.* +#[0-9]+ Frame 0x-?[0-9a-f]+, for file , line 12, in baz \(args=\(1, 2, 3\)\) +#[0-9]+ +$''') + + @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") + def test_down_at_bottom(self): + 'Verify handling of "py-down" at the bottom of the stack' + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-down']) + self.assertEndsWith(bt, + 'Unable to find a newer python frame\n') + + @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") + def test_up_at_top(self): + 'Verify handling of "py-up" at the top of the stack' + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-up'] * 5) + self.assertEndsWith(bt, + 'Unable to find an older python frame\n') + + @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_up_then_down(self): + 'Verify "py-up" followed by "py-down"' + bt = self.get_stack_trace(source=SAMPLE_WITH_C_CALL, + cmds_after_breakpoint=['py-up', 'py-up', 'py-down']) + self.assertMultilineMatches(bt, + r'''^.* +#[0-9]+ Frame 0x-?[0-9a-f]+, for file , line 12, in baz \(args=\(1, 2, 3\)\) +#[0-9]+ +#[0-9]+ Frame 0x-?[0-9a-f]+, for file , line 12, in baz \(args=\(1, 2, 3\)\) +$''') + +class PyPrintTests(DebuggerTests): + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_basic_command(self): + 'Verify that the "py-print" command works' + bt = self.get_stack_trace(source=SAMPLE_WITH_C_CALL, + cmds_after_breakpoint=['py-up', 'py-print args']) + self.assertMultilineMatches(bt, + r".*\nlocal 'args' = \(1, 2, 3\)\n.*") + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") + def test_print_after_up(self): + bt = self.get_stack_trace(source=SAMPLE_WITH_C_CALL, + cmds_after_breakpoint=['py-up', 
'py-up', 'py-print c', 'py-print b', 'py-print a']) + self.assertMultilineMatches(bt, + r".*\nlocal 'c' = 3\nlocal 'b' = 2\nlocal 'a' = 1\n.*") + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_printing_global(self): + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-up', 'py-print __name__']) + self.assertMultilineMatches(bt, + r".*\nglobal '__name__' = '__main__'\n.*") + + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_printing_builtin(self): + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-up', 'py-print len']) + self.assertMultilineMatches(bt, + r".*\nbuiltin 'len' = \n.*") + +class PyLocalsTests(DebuggerTests): + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_basic_command(self): + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-up', 'py-locals']) + self.assertMultilineMatches(bt, + r".*\nargs = \(1, 2, 3\)\n.*") + + @unittest.skipUnless(HAS_PYUP_PYDOWN, "test requires py-up/py-down commands") + @unittest.skipIf(python_is_optimized(), + "Python was compiled with optimizations") + def test_locals_after_up(self): + bt = self.get_stack_trace(script=self.get_sample_script(), + cmds_after_breakpoint=['py-up', 'py-up', 'py-locals']) + self.assertMultilineMatches(bt, + r'''^.* +Locals for foo +a = 1 +b = 2 +c = 3 +Locals for +.*$''') diff --git a/Lib/test/test_gdb/test_pretty_print.py b/Lib/test/test_gdb/test_pretty_print.py new file mode 100644 index 00000000000000..e31dc66f29684a --- /dev/null +++ b/Lib/test/test_gdb/test_pretty_print.py @@ -0,0 +1,400 @@ +import re +import sys +from test import support + +from .util import ( + BREAKPOINT_FN, gdb_major_version, gdb_minor_version, + run_gdb, setup_module, DebuggerTests) + + +def setUpModule(): + setup_module() + + +class PrettyPrintTests(DebuggerTests): + def test_getting_backtrace(self): + gdb_output = self.get_stack_trace('id(42)') + self.assertTrue(BREAKPOINT_FN in gdb_output) + + def assertGdbRepr(self, val, exp_repr=None): + # Ensure that gdb's rendering of the value in a debugged process + # matches repr(value) in this process: + gdb_repr, gdb_output = self.get_gdb_repr('id(' + ascii(val) + ')') + if not exp_repr: + exp_repr = repr(val) + self.assertEqual(gdb_repr, exp_repr, + ('%r did not equal expected %r; full output was:\n%s' + % (gdb_repr, exp_repr, gdb_output))) + + @support.requires_resource('cpu') + def test_int(self): + 'Verify the pretty-printing of various int values' + self.assertGdbRepr(42) + self.assertGdbRepr(0) + self.assertGdbRepr(-7) + self.assertGdbRepr(1000000000000) + self.assertGdbRepr(-1000000000000000) + + def test_singletons(self): + 'Verify the pretty-printing of True, False and None' + self.assertGdbRepr(True) + self.assertGdbRepr(False) + self.assertGdbRepr(None) + + def test_dicts(self): + 'Verify the pretty-printing of dictionaries' + self.assertGdbRepr({}) + self.assertGdbRepr({'foo': 'bar'}, "{'foo': 'bar'}") + # Python preserves insertion order since 3.6 + self.assertGdbRepr({'foo': 'bar', 'douglas': 42}, "{'foo': 'bar', 'douglas': 42}") + + def test_lists(self): + 'Verify the pretty-printing of lists' + self.assertGdbRepr([]) + self.assertGdbRepr(list(range(5))) + + @support.requires_resource('cpu') + def test_bytes(self): + 'Verify the pretty-printing of bytes' + self.assertGdbRepr(b'') + self.assertGdbRepr(b'And now for something hopefully 
the same') + self.assertGdbRepr(b'string with embedded NUL here \0 and then some more text') + self.assertGdbRepr(b'this is a tab:\t' + b' this is a slash-N:\n' + b' this is a slash-R:\r' + ) + + self.assertGdbRepr(b'this is byte 255:\xff and byte 128:\x80') + + self.assertGdbRepr(bytes([b for b in range(255)])) + + @support.requires_resource('cpu') + def test_strings(self): + 'Verify the pretty-printing of unicode strings' + # We cannot simply call locale.getpreferredencoding() here, + # as GDB might have been linked against a different version + # of Python with a different encoding and coercion policy + # with respect to PEP 538 and PEP 540. + out, err = run_gdb( + '--eval-command', + 'python import locale; print(locale.getpreferredencoding())') + + encoding = out.rstrip() + if err or not encoding: + raise RuntimeError( + f'unable to determine the preferred encoding ' + f'of embedded Python in GDB: {err}') + + def check_repr(text): + try: + text.encode(encoding) + except UnicodeEncodeError: + self.assertGdbRepr(text, ascii(text)) + else: + self.assertGdbRepr(text) + + self.assertGdbRepr('') + self.assertGdbRepr('And now for something hopefully the same') + self.assertGdbRepr('string with embedded NUL here \0 and then some more text') + + # Test printing a single character: + # U+2620 SKULL AND CROSSBONES + check_repr('\u2620') + + # Test printing a Japanese unicode string + # (I believe this reads "mojibake", using 3 characters from the CJK + # Unified Ideographs area, followed by U+3051 HIRAGANA LETTER KE) + check_repr('\u6587\u5b57\u5316\u3051') + + # Test a character outside the BMP: + # U+1D121 MUSICAL SYMBOL C CLEF + # This is: + # UTF-8: 0xF0 0x9D 0x84 0xA1 + # UTF-16: 0xD834 0xDD21 + check_repr(chr(0x1D121)) + + def test_tuples(self): + 'Verify the pretty-printing of tuples' + self.assertGdbRepr(tuple(), '()') + self.assertGdbRepr((1,), '(1,)') + self.assertGdbRepr(('foo', 'bar', 'baz')) + + @support.requires_resource('cpu') + def test_sets(self): + 'Verify the pretty-printing of sets' + if (gdb_major_version, gdb_minor_version) < (7, 3): + self.skipTest("pretty-printing of sets needs gdb 7.3 or later") + self.assertGdbRepr(set(), "set()") + self.assertGdbRepr(set(['a']), "{'a'}") + # PYTHONHASHSEED is need to get the exact frozenset item order + if not sys.flags.ignore_environment: + self.assertGdbRepr(set(['a', 'b']), "{'a', 'b'}") + self.assertGdbRepr(set([4, 5, 6]), "{4, 5, 6}") + + # Ensure that we handle sets containing the "dummy" key value, + # which happens on deletion: + gdb_repr, gdb_output = self.get_gdb_repr('''s = set(['a','b']) +s.remove('a') +id(s)''') + self.assertEqual(gdb_repr, "{'b'}") + + @support.requires_resource('cpu') + def test_frozensets(self): + 'Verify the pretty-printing of frozensets' + if (gdb_major_version, gdb_minor_version) < (7, 3): + self.skipTest("pretty-printing of frozensets needs gdb 7.3 or later") + self.assertGdbRepr(frozenset(), "frozenset()") + self.assertGdbRepr(frozenset(['a']), "frozenset({'a'})") + # PYTHONHASHSEED is need to get the exact frozenset item order + if not sys.flags.ignore_environment: + self.assertGdbRepr(frozenset(['a', 'b']), "frozenset({'a', 'b'})") + self.assertGdbRepr(frozenset([4, 5, 6]), "frozenset({4, 5, 6})") + + def test_exceptions(self): + # Test a RuntimeError + gdb_repr, gdb_output = self.get_gdb_repr(''' +try: + raise RuntimeError("I am an error") +except RuntimeError as e: + id(e) +''') + self.assertEqual(gdb_repr, + "RuntimeError('I am an error',)") + + + # Test division by zero: + gdb_repr, gdb_output 
= self.get_gdb_repr(''' +try: + a = 1 / 0 +except ZeroDivisionError as e: + id(e) +''') + self.assertEqual(gdb_repr, + "ZeroDivisionError('division by zero',)") + + def test_modern_class(self): + 'Verify the pretty-printing of new-style class instances' + gdb_repr, gdb_output = self.get_gdb_repr(''' +class Foo: + pass +foo = Foo() +foo.an_int = 42 +id(foo)''') + m = re.match(r'', gdb_repr) + self.assertTrue(m, + msg='Unexpected new-style class rendering %r' % gdb_repr) + + def test_subclassing_list(self): + 'Verify the pretty-printing of an instance of a list subclass' + gdb_repr, gdb_output = self.get_gdb_repr(''' +class Foo(list): + pass +foo = Foo() +foo += [1, 2, 3] +foo.an_int = 42 +id(foo)''') + m = re.match(r'', gdb_repr) + + self.assertTrue(m, + msg='Unexpected new-style class rendering %r' % gdb_repr) + + def test_subclassing_tuple(self): + 'Verify the pretty-printing of an instance of a tuple subclass' + # This should exercise the negative tp_dictoffset code in the + # new-style class support + gdb_repr, gdb_output = self.get_gdb_repr(''' +class Foo(tuple): + pass +foo = Foo((1, 2, 3)) +foo.an_int = 42 +id(foo)''') + m = re.match(r'', gdb_repr) + + self.assertTrue(m, + msg='Unexpected new-style class rendering %r' % gdb_repr) + + def assertSane(self, source, corruption, exprepr=None): + '''Run Python under gdb, corrupting variables in the inferior process + immediately before taking a backtrace. + + Verify that the variable's representation is the expected failsafe + representation''' + if corruption: + cmds_after_breakpoint=[corruption, 'backtrace'] + else: + cmds_after_breakpoint=['backtrace'] + + gdb_repr, gdb_output = \ + self.get_gdb_repr(source, + cmds_after_breakpoint=cmds_after_breakpoint) + if exprepr: + if gdb_repr == exprepr: + # gdb managed to print the value in spite of the corruption; + # this is good (see http://bugs.python.org/issue8330) + return + + # Match anything for the type name; 0xDEADBEEF could point to + # something arbitrary (see http://bugs.python.org/issue8330) + pattern = '<.* at remote 0x-?[0-9a-f]+>' + + m = re.match(pattern, gdb_repr) + if not m: + self.fail('Unexpected gdb representation: %r\n%s' % \ + (gdb_repr, gdb_output)) + + def test_NULL_ptr(self): + 'Ensure that a NULL PyObject* is handled gracefully' + gdb_repr, gdb_output = ( + self.get_gdb_repr('id(42)', + cmds_after_breakpoint=['set variable v=0', + 'backtrace']) + ) + + self.assertEqual(gdb_repr, '0x0') + + def test_NULL_ob_type(self): + 'Ensure that a PyObject* with NULL ob_type is handled gracefully' + self.assertSane('id(42)', + 'set v->ob_type=0') + + def test_corrupt_ob_type(self): + 'Ensure that a PyObject* with a corrupt ob_type is handled gracefully' + self.assertSane('id(42)', + 'set v->ob_type=0xDEADBEEF', + exprepr='42') + + def test_corrupt_tp_flags(self): + 'Ensure that a PyObject* with a type with corrupt tp_flags is handled' + self.assertSane('id(42)', + 'set v->ob_type->tp_flags=0x0', + exprepr='42') + + def test_corrupt_tp_name(self): + 'Ensure that a PyObject* with a type with corrupt tp_name is handled' + self.assertSane('id(42)', + 'set v->ob_type->tp_name=0xDEADBEEF', + exprepr='42') + + def test_builtins_help(self): + 'Ensure that the new-style class _Helper in site.py can be handled' + + if sys.flags.no_site: + self.skipTest("need site module, but -S option was used") + + # (this was the issue causing tracebacks in + # http://bugs.python.org/issue8032#msg100537 ) + gdb_repr, gdb_output = self.get_gdb_repr('id(__builtins__.help)', import_site=True) + + m = 
re.match(r'<_Helper\(\) at remote 0x-?[0-9a-f]+>', gdb_repr) + self.assertTrue(m, + msg='Unexpected rendering %r' % gdb_repr) + + def test_selfreferential_list(self): + '''Ensure that a reference loop involving a list doesn't lead proxyval + into an infinite loop:''' + gdb_repr, gdb_output = \ + self.get_gdb_repr("a = [3, 4, 5] ; a.append(a) ; id(a)") + self.assertEqual(gdb_repr, '[3, 4, 5, [...]]') + + gdb_repr, gdb_output = \ + self.get_gdb_repr("a = [3, 4, 5] ; b = [a] ; a.append(b) ; id(a)") + self.assertEqual(gdb_repr, '[3, 4, 5, [[...]]]') + + def test_selfreferential_dict(self): + '''Ensure that a reference loop involving a dict doesn't lead proxyval + into an infinite loop:''' + gdb_repr, gdb_output = \ + self.get_gdb_repr("a = {} ; b = {'bar':a} ; a['foo'] = b ; id(a)") + + self.assertEqual(gdb_repr, "{'foo': {'bar': {...}}}") + + def test_selfreferential_old_style_instance(self): + gdb_repr, gdb_output = \ + self.get_gdb_repr(''' +class Foo: + pass +foo = Foo() +foo.an_attr = foo +id(foo)''') + self.assertTrue(re.match(r'\) at remote 0x-?[0-9a-f]+>', + gdb_repr), + 'Unexpected gdb representation: %r\n%s' % \ + (gdb_repr, gdb_output)) + + def test_selfreferential_new_style_instance(self): + gdb_repr, gdb_output = \ + self.get_gdb_repr(''' +class Foo(object): + pass +foo = Foo() +foo.an_attr = foo +id(foo)''') + self.assertTrue(re.match(r'\) at remote 0x-?[0-9a-f]+>', + gdb_repr), + 'Unexpected gdb representation: %r\n%s' % \ + (gdb_repr, gdb_output)) + + gdb_repr, gdb_output = \ + self.get_gdb_repr(''' +class Foo(object): + pass +a = Foo() +b = Foo() +a.an_attr = b +b.an_attr = a +id(a)''') + self.assertTrue(re.match(r'\) at remote 0x-?[0-9a-f]+>\) at remote 0x-?[0-9a-f]+>', + gdb_repr), + 'Unexpected gdb representation: %r\n%s' % \ + (gdb_repr, gdb_output)) + + def test_truncation(self): + 'Verify that very long output is truncated' + gdb_repr, gdb_output = self.get_gdb_repr('id(list(range(1000)))') + self.assertEqual(gdb_repr, + "[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, " + "14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, " + "27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, " + "40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, " + "53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, " + "66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, " + "79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, " + "92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, " + "104, 105, 106, 107, 108, 109, 110, 111, 112, 113, " + "114, 115, 116, 117, 118, 119, 120, 121, 122, 123, " + "124, 125, 126, 127, 128, 129, 130, 131, 132, 133, " + "134, 135, 136, 137, 138, 139, 140, 141, 142, 143, " + "144, 145, 146, 147, 148, 149, 150, 151, 152, 153, " + "154, 155, 156, 157, 158, 159, 160, 161, 162, 163, " + "164, 165, 166, 167, 168, 169, 170, 171, 172, 173, " + "174, 175, 176, 177, 178, 179, 180, 181, 182, 183, " + "184, 185, 186, 187, 188, 189, 190, 191, 192, 193, " + "194, 195, 196, 197, 198, 199, 200, 201, 202, 203, " + "204, 205, 206, 207, 208, 209, 210, 211, 212, 213, " + "214, 215, 216, 217, 218, 219, 220, 221, 222, 223, " + "224, 225, 226...(truncated)") + self.assertEqual(len(gdb_repr), + 1024 + len('...(truncated)')) + + def test_builtin_method(self): + gdb_repr, gdb_output = self.get_gdb_repr('import sys; id(sys.stdout.readlines)') + self.assertTrue(re.match(r'', + gdb_repr), + 'Unexpected gdb representation: %r\n%s' % \ + (gdb_repr, gdb_output)) + + def test_frames(self): + gdb_output = self.get_stack_trace(''' +import sys +def foo(a, b, c): + return sys._getframe(0) + +f = foo(3, 
4, 5) +id(f)''', + breakpoint='builtin_id', + cmds_after_breakpoint=['print (PyFrameObject*)v'] + ) + self.assertTrue(re.match(r'.*\s+\$1 =\s+Frame 0x-?[0-9a-f]+, for file , line 4, in foo \(a=3.*', + gdb_output, + re.DOTALL), + 'Unexpected gdb representation: %r\n%s' % (gdb_output, gdb_output)) diff --git a/Lib/test/test_gdb/util.py b/Lib/test/test_gdb/util.py new file mode 100644 index 00000000000000..30beb4e14285c7 --- /dev/null +++ b/Lib/test/test_gdb/util.py @@ -0,0 +1,304 @@ +import os +import re +import subprocess +import sys +import sysconfig +import unittest +from test import support + + +MS_WINDOWS = (sys.platform == 'win32') +if MS_WINDOWS: + raise unittest.SkipTest("test_gdb doesn't work on Windows") + + +def get_gdb_version(): + try: + cmd = ["gdb", "-nx", "--version"] + proc = subprocess.Popen(cmd, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + universal_newlines=True) + with proc: + version, stderr = proc.communicate() + + if proc.returncode: + raise Exception(f"Command {' '.join(cmd)!r} failed " + f"with exit code {proc.returncode}: " + f"stdout={version!r} stderr={stderr!r}") + except OSError: + # This is what "no gdb" looks like. There may, however, be other + # errors that manifest this way too. + raise unittest.SkipTest("Couldn't find gdb on the path") + + # Regex to parse: + # 'GNU gdb (GDB; SUSE Linux Enterprise 12) 7.7\n' -> 7.7 + # 'GNU gdb (GDB) Fedora 7.9.1-17.fc22\n' -> 7.9 + # 'GNU gdb 6.1.1 [FreeBSD]\n' -> 6.1 + # 'GNU gdb (GDB) Fedora (7.5.1-37.fc18)\n' -> 7.5 + # 'HP gdb 6.7 for HP Itanium (32 or 64 bit) and target HP-UX 11iv2 and 11iv3.\n' -> 6.7 + match = re.search(r"^(?:GNU|HP) gdb.*?\b(\d+)\.(\d+)", version) + if match is None: + raise Exception("unable to parse GDB version: %r" % version) + return (version, int(match.group(1)), int(match.group(2))) + +gdb_version, gdb_major_version, gdb_minor_version = get_gdb_version() +if gdb_major_version < 7: + raise unittest.SkipTest("gdb versions before 7.0 didn't support python " + "embedding. Saw %s.%s:\n%s" + % (gdb_major_version, gdb_minor_version, + gdb_version)) + +if not sysconfig.is_python_build(): + raise unittest.SkipTest("test_gdb only works on source builds at the moment.") + +if ((sysconfig.get_config_var('PGO_PROF_USE_FLAG') or 'xxx') in + (sysconfig.get_config_var('PY_CORE_CFLAGS') or '')): + raise unittest.SkipTest("test_gdb is not reliable on PGO builds") + +# Location of custom hooks file in a repository checkout. +checkout_hook_path = os.path.join(os.path.dirname(sys.executable), + 'python-gdb.py') + +PYTHONHASHSEED = '123' + + +def cet_protection(): + cflags = sysconfig.get_config_var('CFLAGS') + if not cflags: + return False + flags = cflags.split() + # True if "-mcet -fcf-protection" options are found, but false + # if "-fcf-protection=none" or "-fcf-protection=return" is found. + return (('-mcet' in flags) + and any((flag.startswith('-fcf-protection') + and not flag.endswith(("=none", "=return"))) + for flag in flags)) + +# Control-flow enforcement technology +CET_PROTECTION = cet_protection() + + +def run_gdb(*args, **env_vars): + """Runs gdb in --batch mode with the additional arguments given by *args. + + Returns its (stdout, stderr) decoded from utf-8 using the replace handler. 
+ """ + if env_vars: + env = os.environ.copy() + env.update(env_vars) + else: + env = None + # -nx: Do not execute commands from any .gdbinit initialization files + # (issue #22188) + base_cmd = ('gdb', '--batch', '-nx') + if (gdb_major_version, gdb_minor_version) >= (7, 4): + base_cmd += ('-iex', 'add-auto-load-safe-path ' + checkout_hook_path) + proc = subprocess.Popen(base_cmd + args, + # Redirect stdin to prevent GDB from messing with + # the terminal settings + stdin=subprocess.PIPE, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + env=env) + with proc: + out, err = proc.communicate() + return out.decode('utf-8', 'replace'), err.decode('utf-8', 'replace') + +# Verify that "gdb" was built with the embedded python support enabled: +gdbpy_version, _ = run_gdb("--eval-command=python import sys; print(sys.version_info)") +if not gdbpy_version: + raise unittest.SkipTest("gdb not built with embedded python support") + +if "major=2" in gdbpy_version: + raise unittest.SkipTest("gdb built with Python 2") + +# Verify that "gdb" can load our custom hooks, as OS security settings may +# disallow this without a customized .gdbinit. +_, gdbpy_errors = run_gdb('--args', sys.executable) +if "auto-loading has been declined" in gdbpy_errors: + msg = "gdb security settings prevent use of custom hooks: " + raise unittest.SkipTest(msg + gdbpy_errors.rstrip()) + +BREAKPOINT_FN='builtin_id' + + +def setup_module(): + if support.verbose: + print("GDB version %s.%s:" % (gdb_major_version, gdb_minor_version)) + for line in gdb_version.splitlines(): + print(" " * 4 + line) + + +@unittest.skipIf(support.PGO, "not useful for PGO") +class DebuggerTests(unittest.TestCase): + + """Test that the debugger can debug Python.""" + + def get_stack_trace(self, source=None, script=None, + breakpoint=BREAKPOINT_FN, + cmds_after_breakpoint=None, + import_site=False, + ignore_stderr=False): + ''' + Run 'python -c SOURCE' under gdb with a breakpoint. + + Support injecting commands after the breakpoint is reached + + Returns the stdout from gdb + + cmds_after_breakpoint: if provided, a list of strings: gdb commands + ''' + # We use "set breakpoint pending yes" to avoid blocking with a: + # Function "foo" not defined. + # Make breakpoint pending on future shared library load? (y or [n]) + # error, which typically happens python is dynamically linked (the + # breakpoints of interest are to be found in the shared library) + # When this happens, we still get: + # Function "textiowrapper_write" not defined. + # emitted to stderr each time, alas. + + # Initially I had "--eval-command=continue" here, but removed it to + # avoid repeated print breakpoints when traversing hierarchical data + # structures + + # Generate a list of commands in gdb's language: + commands = ['set breakpoint pending yes', + 'break %s' % breakpoint, + + # The tests assume that the first frame of printed + # backtrace will not contain program counter, + # that is however not guaranteed by gdb + # therefore we need to use 'set print address off' to + # make sure the counter is not there. For example: + # #0 in PyObject_Print ... + # is assumed, but sometimes this can be e.g. + # #0 0x00003fffb7dd1798 in PyObject_Print ... 
+ 'set print address off', + + 'run'] + + # GDB as of 7.4 onwards can distinguish between the + # value of a variable at entry vs current value: + # http://sourceware.org/gdb/onlinedocs/gdb/Variables.html + # which leads to the selftests failing with errors like this: + # AssertionError: 'v@entry=()' != '()' + # Disable this: + if (gdb_major_version, gdb_minor_version) >= (7, 4): + commands += ['set print entry-values no'] + + if cmds_after_breakpoint: + if CET_PROTECTION: + # bpo-32962: When Python is compiled with -mcet + # -fcf-protection, function arguments are unusable before + # running the first instruction of the function entry point. + # The 'next' command makes the required first step. + commands += ['next'] + commands += cmds_after_breakpoint + else: + commands += ['backtrace'] + + # print commands + + # Use "commands" to generate the arguments with which to invoke "gdb": + args = ['--eval-command=%s' % cmd for cmd in commands] + args += ["--args", + sys.executable] + args.extend(subprocess._args_from_interpreter_flags()) + + if not import_site: + # -S suppresses the default 'import site' + args += ["-S"] + + if source: + args += ["-c", source] + elif script: + args += [script] + + # Use "args" to invoke gdb, capturing stdout, stderr: + out, err = run_gdb(*args, PYTHONHASHSEED=PYTHONHASHSEED) + + if not ignore_stderr: + for line in err.splitlines(): + print(line, file=sys.stderr) + + # bpo-34007: Sometimes some versions of the shared libraries that + # are part of the traceback are compiled in optimised mode and the + # Program Counter (PC) is not present, not allowing gdb to walk the + # frames back. When this happens, the Python bindings of gdb raise + # an exception, making the test impossible to succeed. + if "PC not saved" in err: + raise unittest.SkipTest("gdb cannot walk the frame object" + " because the Program Counter is" + " not present") + + # bpo-40019: Skip the test if gdb failed to read debug information + # because the Python binary is optimized. + for pattern in ( + '(frame information optimized out)', + 'Unable to read information on python frame', + # gh-91960: On Python built with "clang -Og", gdb gets + # "frame=" for _PyEval_EvalFrameDefault() parameter + '(unable to read python frame information)', + # gh-104736: On Python built with "clang -Og" on ppc64le, + # "py-bt" displays a truncated or not traceback, but "where" + # logs this error message: + 'Backtrace stopped: frame did not save the PC', + # gh-104736: When "bt" command displays something like: + # "#1 0x0000000000000000 in ?? ()", the traceback is likely + # truncated or wrong. + ' ?? ()', + ): + if pattern in out: + raise unittest.SkipTest(f"{pattern!r} found in gdb output") + + return out + + def get_gdb_repr(self, source, + cmds_after_breakpoint=None, + import_site=False): + # Given an input python source representation of data, + # run "python -c'id(DATA)'" under gdb with a breakpoint on + # builtin_id and scrape out gdb's representation of the "op" + # parameter, and verify that the gdb displays the same string + # + # Verify that the gdb displays the expected string + # + # For a nested structure, the first time we hit the breakpoint will + # give us the top-level structure + + # NOTE: avoid decoding too much of the traceback as some + # undecodable characters may lurk there in optimized mode + # (issue #19743). 
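# Illustrative aside, not part of the patch: roughly the argv that
# get_stack_trace() above hands to run_gdb() for source='id(42)' with the
# default breakpoint and no cmds_after_breakpoint. The add-auto-load-safe-path
# option, 'set print entry-values no' and the interpreter flags are omitted
# for brevity, and 'python' stands in for sys.executable.
commands = ['set breakpoint pending yes',
            'break builtin_id',
            'set print address off',
            'run',
            'backtrace']
argv = ['gdb', '--batch', '-nx']
argv += ['--eval-command=%s' % cmd for cmd in commands]
argv += ['--args', 'python', '-S', '-c', 'id(42)']
print(argv)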
+ cmds_after_breakpoint = cmds_after_breakpoint or ["backtrace 1"] + gdb_output = self.get_stack_trace(source, breakpoint=BREAKPOINT_FN, + cmds_after_breakpoint=cmds_after_breakpoint, + import_site=import_site) + # gdb can insert additional '\n' and space characters in various places + # in its output, depending on the width of the terminal it's connected + # to (using its "wrap_here" function) + m = re.search( + # Match '#0 builtin_id(self=..., v=...)' + r'#0\s+builtin_id\s+\(self\=.*,\s+v=\s*(.*?)?\)' + # Match ' at Python/bltinmodule.c'. + # bpo-38239: builtin_id() is defined in Python/bltinmodule.c, + # but accept any "Directory\file.c" to support Link Time + # Optimization (LTO). + r'\s+at\s+\S*[A-Za-z]+/[A-Za-z0-9_-]+\.c', + gdb_output, re.DOTALL) + if not m: + self.fail('Unexpected gdb output: %r\n%s' % (gdb_output, gdb_output)) + return m.group(1), gdb_output + + def assertEndsWith(self, actual, exp_end): + '''Ensure that the given "actual" string ends with "exp_end"''' + self.assertTrue(actual.endswith(exp_end), + msg='%r did not end with %r' % (actual, exp_end)) + + def assertMultilineMatches(self, actual, pattern): + m = re.match(pattern, actual, re.DOTALL) + if not m: + self.fail(msg='%r did not match %r' % (actual, pattern)) + + def get_sample_script(self): + return os.path.join(os.path.dirname(__file__), 'gdb_sample.py') diff --git a/Makefile.pre.in b/Makefile.pre.in index 0d0ac1315bdc7d..ba8e6349e815de 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1966,6 +1966,7 @@ TESTSUBDIRS= ctypes/test \ test/test_email \ test/test_email/data \ test/test_future_stmt \ + test/test_gdb \ test/test_import \ test/test_import/data \ test/test_import/data/circular_imports \ diff --git a/Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst b/Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst new file mode 100644 index 00000000000000..7b6007678388b1 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst @@ -0,0 +1,2 @@ +Split test_gdb.py file into a test_gdb package made of multiple tests, so tests +can now be run in parallel. Patch by Victor Stinner. From f9ac377626ff993569b532e693acf3f08a6c1f43 Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Wed, 4 Oct 2023 12:58:49 +0200 Subject: [PATCH 011/265] [3.11] Add test.support.busy_retry() (#93770) (#110341) Add test.support.busy_retry() (#93770) Add busy_retry() and sleeping_retry() functions to test.support. (cherry picked from commit 7e9eaad864349d2cfd4c9ffc4453aba03b2cbc16) --- Doc/library/test.rst | 45 +++++++++++ Lib/test/_test_multiprocessing.py | 60 ++++++--------- Lib/test/fork_wait.py | 6 +- Lib/test/support/__init__.py | 76 +++++++++++++++++++ Lib/test/test__xxsubinterpreters.py | 11 ++- Lib/test/test_concurrent_futures/test_init.py | 9 +-- .../test_multiprocessing_main_handling.py | 25 +++--- Lib/test/test_signal.py | 25 +++--- Lib/test/test_ssl.py | 5 +- Lib/test/test_support.py | 12 +-- Lib/test/test_wait3.py | 5 +- Lib/test/test_wait4.py | 5 +- 12 files changed, 185 insertions(+), 99 deletions(-) diff --git a/Doc/library/test.rst b/Doc/library/test.rst index 427953e0077aab..d3eb0ae00a08dc 100644 --- a/Doc/library/test.rst +++ b/Doc/library/test.rst @@ -404,6 +404,51 @@ The :mod:`test.support` module defines the following constants: The :mod:`test.support` module defines the following functions: +.. function:: busy_retry(timeout, err_msg=None, /, *, error=True) + + Run the loop body until ``break`` stops the loop. 
+ + After *timeout* seconds, raise an :exc:`AssertionError` if *error* is true, + or just stop the loop if *error* is false. + + Example:: + + for _ in support.busy_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage:: + + for _ in support.busy_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + +.. function:: sleeping_retry(timeout, err_msg=None, /, *, init_delay=0.010, max_delay=1.0, error=True) + + Wait strategy that applies exponential backoff. + + Run the loop body until ``break`` stops the loop. Sleep at each loop + iteration, but not at the first iteration. The sleep delay is doubled at + each iteration (up to *max_delay* seconds). + + See :func:`busy_retry` documentation for the parameters usage. + + Example raising an exception after SHORT_TIMEOUT seconds:: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage:: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + .. function:: is_resource_enabled(resource) Return ``True`` if *resource* is enabled and available. The list of diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 6249062639b8d4..6459757eb93d51 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -4376,18 +4376,13 @@ def test_shared_memory_cleaned_after_process_termination(self): p.terminate() p.wait() - deadline = time.monotonic() + support.LONG_TIMEOUT - t = 0.1 - while time.monotonic() < deadline: - time.sleep(t) - t = min(t*2, 5) + err_msg = ("A SharedMemory segment was leaked after " + "a process was abruptly terminated") + for _ in support.sleeping_retry(support.LONG_TIMEOUT, err_msg): try: smm = shared_memory.SharedMemory(name, create=False) except FileNotFoundError: break - else: - raise AssertionError("A SharedMemory segment was leaked after" - " a process was abruptly terminated.") if os.name == 'posix': # Without this line it was raising warnings like: @@ -5458,9 +5453,10 @@ def create_and_register_resource(rtype): p.terminate() p.wait() - deadline = time.monotonic() + support.LONG_TIMEOUT - while time.monotonic() < deadline: - time.sleep(.5) + err_msg = (f"A {rtype} resource was leaked after a process was " + f"abruptly terminated") + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, + err_msg): try: _resource_unlink(name2, rtype) except OSError as e: @@ -5468,10 +5464,7 @@ def create_and_register_resource(rtype): # EINVAL self.assertIn(e.errno, (errno.ENOENT, errno.EINVAL)) break - else: - raise AssertionError( - f"A {rtype} resource was leaked after a process was " - f"abruptly terminated.") + err = p.stderr.read().decode('utf-8') p.stderr.close() expected = ('resource_tracker: There appear to be 2 leaked {} ' @@ -5707,18 +5700,17 @@ def wait_proc_exit(self): # but this can take a bit on slow machines, so wait a few seconds # if there are other children too (see #17395). 
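# Illustrative aside, not part of the patch: the conversion pattern the
# changes in this patch apply. check() is a hypothetical predicate; the real
# tests poll things like multiprocessing.active_children() or os.wait3().
#
#   before:
#       deadline = time.monotonic() + timeout
#       while not check():
#           if time.monotonic() > deadline:
#               raise AssertionError("timeout")
#           time.sleep(0.1)
#
#   after:
#       for _ in support.sleeping_retry(timeout):
#           if check():
#               break
#
# sleeping_retry() doubles its sleep each iteration, capped at max_delay; with
# the defaults (init_delay=0.010, max_delay=1.0) the delays come out as:
delay, delays = 0.010, []
for _ in range(9):
    delays.append(delay)
    delay = min(delay * 2, 1.0)
print(delays)   # -> [0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, 1.0, 1.0]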
join_process(self.proc) + start_time = time.monotonic() - t = 0.01 - while len(multiprocessing.active_children()) > 1: - time.sleep(t) - t *= 2 - dt = time.monotonic() - start_time - if dt >= 5.0: - test.support.environment_altered = True - support.print_warning(f"multiprocessing.Manager still has " - f"{multiprocessing.active_children()} " - f"active children after {dt} seconds") + for _ in support.sleeping_retry(5.0, error=False): + if len(multiprocessing.active_children()) <= 1: break + else: + dt = time.monotonic() - start_time + support.environment_altered = True + support.print_warning(f"multiprocessing.Manager still has " + f"{multiprocessing.active_children()} " + f"active children after {dt:.1f} seconds") def run_worker(self, worker, obj): self.proc = multiprocessing.Process(target=worker, args=(obj, )) @@ -6031,17 +6023,15 @@ def tearDownClass(cls): # but this can take a bit on slow machines, so wait a few seconds # if there are other children too (see #17395) start_time = time.monotonic() - t = 0.01 - while len(multiprocessing.active_children()) > 1: - time.sleep(t) - t *= 2 - dt = time.monotonic() - start_time - if dt >= 5.0: - test.support.environment_altered = True - support.print_warning(f"multiprocessing.Manager still has " - f"{multiprocessing.active_children()} " - f"active children after {dt} seconds") + for _ in support.sleeping_retry(5.0, error=False): + if len(multiprocessing.active_children()) <= 1: break + else: + dt = time.monotonic() - start_time + support.environment_altered = True + support.print_warning(f"multiprocessing.Manager still has " + f"{multiprocessing.active_children()} " + f"active children after {dt:.1f} seconds") gc.collect() # do garbage collection if cls.manager._number_of_objects() != 0: diff --git a/Lib/test/fork_wait.py b/Lib/test/fork_wait.py index 4d3dbd8e83f5a9..c565f593559483 100644 --- a/Lib/test/fork_wait.py +++ b/Lib/test/fork_wait.py @@ -54,10 +54,8 @@ def test_wait(self): self.threads.append(thread) # busy-loop to wait for threads - deadline = time.monotonic() + support.SHORT_TIMEOUT - while len(self.alive) < NUM_THREADS: - time.sleep(0.1) - if deadline < time.monotonic(): + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): + if len(self.alive) >= NUM_THREADS: break a = sorted(self.alive.keys()) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 2e6518cf241f14..6ee525c061f624 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -2324,6 +2324,82 @@ def requires_venv_with_pip(): return unittest.skipUnless(ctypes, 'venv: pip requires ctypes') +def busy_retry(timeout, err_msg=None, /, *, error=True): + """ + Run the loop body until "break" stops the loop. + + After *timeout* seconds, raise an AssertionError if *error* is true, + or just stop if *error is false. 
+ + Example: + + for _ in support.busy_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage: + + for _ in support.busy_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + + """ + if timeout <= 0: + raise ValueError("timeout must be greater than zero") + + start_time = time.monotonic() + deadline = start_time + timeout + + while True: + yield + + if time.monotonic() >= deadline: + break + + if error: + dt = time.monotonic() - start_time + msg = f"timeout ({dt:.1f} seconds)" + if err_msg: + msg = f"{msg}: {err_msg}" + raise AssertionError(msg) + + +def sleeping_retry(timeout, err_msg=None, /, + *, init_delay=0.010, max_delay=1.0, error=True): + """ + Wait strategy that applies exponential backoff. + + Run the loop body until "break" stops the loop. Sleep at each loop + iteration, but not at the first iteration. The sleep delay is doubled at + each iteration (up to *max_delay* seconds). + + See busy_retry() documentation for the parameters usage. + + Example raising an exception after SHORT_TIMEOUT seconds: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if check(): + break + + Example of error=False usage: + + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): + if check(): + break + else: + raise RuntimeError('my custom error') + """ + + delay = init_delay + for _ in busy_retry(timeout, err_msg, error=error): + yield + + time.sleep(delay) + delay = min(delay * 2, max_delay) + + @contextlib.contextmanager def adjust_int_max_str_digits(max_digits): """Temporarily change the integer string conversion length limit.""" diff --git a/Lib/test/test__xxsubinterpreters.py b/Lib/test/test__xxsubinterpreters.py index 5d0ed9ea14ac7f..f20aae8e21c66f 100644 --- a/Lib/test/test__xxsubinterpreters.py +++ b/Lib/test/test__xxsubinterpreters.py @@ -45,12 +45,11 @@ def _wait_for_interp_to_run(interp, timeout=None): # run subinterpreter eariler than the main thread in multiprocess. if timeout is None: timeout = support.SHORT_TIMEOUT - start_time = time.monotonic() - deadline = start_time + timeout - while not interpreters.is_running(interp): - if time.monotonic() > deadline: - raise RuntimeError('interp is not running') - time.sleep(0.010) + for _ in support.sleeping_retry(timeout, error=False): + if interpreters.is_running(interp): + break + else: + raise RuntimeError('interp is not running') @contextlib.contextmanager diff --git a/Lib/test/test_concurrent_futures/test_init.py b/Lib/test/test_concurrent_futures/test_init.py index 138d6ee545fd92..ebab9437b9c4e1 100644 --- a/Lib/test/test_concurrent_futures/test_init.py +++ b/Lib/test/test_concurrent_futures/test_init.py @@ -78,11 +78,10 @@ def test_initializer(self): future.result() # At some point, the executor should break - t1 = time.monotonic() - while not self.executor._broken: - if time.monotonic() - t1 > 5: - self.fail("executor not broken after 5 s.") - time.sleep(0.01) + for _ in support.sleeping_retry(5, "executor not broken"): + if self.executor._broken: + break + # ... 
and from this point submit() is guaranteed to fail with self.assertRaises(BrokenExecutor): self.executor.submit(get_init_status) diff --git a/Lib/test/test_multiprocessing_main_handling.py b/Lib/test/test_multiprocessing_main_handling.py index 510d8d3a7597e1..35e9cd64fa6c01 100644 --- a/Lib/test/test_multiprocessing_main_handling.py +++ b/Lib/test/test_multiprocessing_main_handling.py @@ -40,6 +40,7 @@ import sys import time from multiprocessing import Pool, set_start_method +from test import support # We use this __main__ defined function in the map call below in order to # check that multiprocessing in correctly running the unguarded @@ -59,13 +60,11 @@ def f(x): results = [] with Pool(5) as pool: pool.map_async(f, [1, 2, 3], callback=results.extend) - start_time = time.monotonic() - while not results: - time.sleep(0.05) - # up to 1 min to report the results - dt = time.monotonic() - start_time - if dt > 60.0: - raise RuntimeError("Timed out waiting for results (%.1f sec)" % dt) + + # up to 1 min to report the results + for _ in support.sleeping_retry(60, "Timed out waiting for results"): + if results: + break results.sort() print(start_method, "->", results) @@ -86,19 +85,17 @@ def f(x): import sys import time from multiprocessing import Pool, set_start_method +from test import support start_method = sys.argv[1] set_start_method(start_method) results = [] with Pool(5) as pool: pool.map_async(int, [1, 4, 9], callback=results.extend) - start_time = time.monotonic() - while not results: - time.sleep(0.05) - # up to 1 min to report the results - dt = time.monotonic() - start_time - if dt > 60.0: - raise RuntimeError("Timed out waiting for results (%.1f sec)" % dt) + # up to 1 min to report the results + for _ in support.sleeping_retry(60, "Timed out waiting for results"): + if results: + break results.sort() print(start_method, "->", results) diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index f6632896100a66..a00c7e539400d9 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -813,13 +813,14 @@ def test_itimer_virtual(self): signal.signal(signal.SIGVTALRM, self.sig_vtalrm) signal.setitimer(self.itimer, 0.3, 0.2) - start_time = time.monotonic() - while time.monotonic() - start_time < 60.0: + for _ in support.busy_retry(60.0, error=False): # use up some virtual time by doing real work _ = pow(12345, 67890, 10000019) if signal.getitimer(self.itimer) == (0.0, 0.0): - break # sig_vtalrm handler stopped this itimer - else: # Issue 8424 + # sig_vtalrm handler stopped this itimer + break + else: + # bpo-8424 self.skipTest("timeout: likely cause: machine too slow or load too " "high") @@ -833,13 +834,14 @@ def test_itimer_prof(self): signal.signal(signal.SIGPROF, self.sig_prof) signal.setitimer(self.itimer, 0.2, 0.2) - start_time = time.monotonic() - while time.monotonic() - start_time < 60.0: + for _ in support.busy_retry(60.0, error=False): # do some work _ = pow(12345, 67890, 10000019) if signal.getitimer(self.itimer) == (0.0, 0.0): - break # sig_prof handler stopped this itimer - else: # Issue 8424 + # sig_prof handler stopped this itimer + break + else: + # bpo-8424 self.skipTest("timeout: likely cause: machine too slow or load too " "high") @@ -1308,8 +1310,6 @@ def handler(signum, frame): self.setsig(signal.SIGALRM, handler) # for ITIMER_REAL expected_sigs = 0 - deadline = time.monotonic() + support.SHORT_TIMEOUT - while expected_sigs < N: # Hopefully the SIGALRM will be received somewhere during # initial processing of SIGUSR1. 
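# Illustrative aside, not part of the patch: the for/else idiom relied on by
# the itimer tests above. With error=False the retry generator simply stops at
# the timeout instead of raising, so the else: branch runs only when no
# iteration executed `break`. attempts() below is a hypothetical stand-in for
# support.busy_retry(..., error=False).
def attempts(n):
    yield from range(n)

succeeded = False
for _ in attempts(3):
    if succeeded:        # never true in this toy example
        break
else:
    print("loop exhausted without break -> handle the timeout (e.g. skipTest)")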
@@ -1318,8 +1318,9 @@ def handler(signum, frame): expected_sigs += 2 # Wait for handlers to run to avoid signal coalescing - while len(sigs) < expected_sigs and time.monotonic() < deadline: - time.sleep(1e-5) + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): + if len(sigs) >= expected_sigs: + break # All ITIMER_REAL signals should have been delivered to the # Python handler diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 965c2728914b50..1bfec237484124 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -2279,11 +2279,8 @@ def ssl_io_loop(self, sock, incoming, outgoing, func, *args, **kwargs): # A simple IO loop. Call func(*args) depending on the error we get # (WANT_READ or WANT_WRITE) move data between the socket and the BIOs. timeout = kwargs.get('timeout', support.SHORT_TIMEOUT) - deadline = time.monotonic() + timeout count = 0 - while True: - if time.monotonic() > deadline: - self.fail("timeout") + for _ in support.busy_retry(timeout): errno = None count += 1 try: diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index 01ba88ce42c5f6..fec96bef4bbd22 100644 --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -10,7 +10,6 @@ import sysconfig import tempfile import textwrap -import time import unittest import warnings @@ -462,18 +461,12 @@ def test_reap_children(self): # child process: do nothing, just exit os._exit(0) - t0 = time.monotonic() - deadline = time.monotonic() + support.SHORT_TIMEOUT - was_altered = support.environment_altered try: support.environment_altered = False stderr = io.StringIO() - while True: - if time.monotonic() > deadline: - self.fail("timeout") - + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): with support.swap_attr(support.print_warning, 'orig_stderr', stderr): support.reap_children() @@ -482,9 +475,6 @@ def test_reap_children(self): if support.environment_altered: break - # loop until the child process completed - time.sleep(0.100) - msg = "Warning -- reap_children() reaped child process %s" % pid self.assertIn(msg, stderr.getvalue()) self.assertTrue(support.environment_altered) diff --git a/Lib/test/test_wait3.py b/Lib/test/test_wait3.py index 4ec7690ac19bbf..15d66ae825abf5 100644 --- a/Lib/test/test_wait3.py +++ b/Lib/test/test_wait3.py @@ -4,7 +4,6 @@ import os import subprocess import sys -import time import unittest from test.fork_wait import ForkWait from test import support @@ -20,14 +19,12 @@ def wait_impl(self, cpid, *, exitcode): # This many iterations can be required, since some previously run # tests (e.g. test_ctypes) could have spawned a lot of children # very quickly. - deadline = time.monotonic() + support.SHORT_TIMEOUT - while time.monotonic() <= deadline: + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): # wait3() shouldn't hang, but some of the buildbots seem to hang # in the forking tests. This is an attempt to fix the problem. spid, status, rusage = os.wait3(os.WNOHANG) if spid == cpid: break - time.sleep(0.1) self.assertEqual(spid, cpid) self.assertEqual(os.waitstatus_to_exitcode(status), exitcode) diff --git a/Lib/test/test_wait4.py b/Lib/test/test_wait4.py index 24f1aaec60c56b..f66c0db1c20e6e 100644 --- a/Lib/test/test_wait4.py +++ b/Lib/test/test_wait4.py @@ -2,7 +2,6 @@ """ import os -import time import sys import unittest from test.fork_wait import ForkWait @@ -22,14 +21,12 @@ def wait_impl(self, cpid, *, exitcode): # Issue #11185: wait4 is broken on AIX and will always return 0 # with WNOHANG. 
option = 0 - deadline = time.monotonic() + support.SHORT_TIMEOUT - while time.monotonic() <= deadline: + for _ in support.sleeping_retry(support.SHORT_TIMEOUT, error=False): # wait4() shouldn't hang, but some of the buildbots seem to hang # in the forking tests. This is an attempt to fix the problem. spid, status, rusage = os.wait4(cpid, option) if spid == cpid: break - time.sleep(0.1) self.assertEqual(spid, cpid) self.assertEqual(os.waitstatus_to_exitcode(status), exitcode) self.assertTrue(rusage) From 497c8c4520e563e6b21f6707adb617542798e125 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 4 Oct 2023 04:09:43 -0700 Subject: [PATCH 012/265] [3.11] gh-108927: Fix test_import + test_importlib + test_unittest problem (GH-108929) (#110347) gh-108927: Fix test_import + test_importlib + test_unittest problem (GH-108929) (cherry picked from commit 3f89b257639dd817a32079da2ae2c4436b8e82eb) Co-authored-by: Nikita Sobolev --- Lib/unittest/test/test_discovery.py | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/Lib/unittest/test/test_discovery.py b/Lib/unittest/test/test_discovery.py index 3b58786ec16a10..c39ad2e3acfb29 100644 --- a/Lib/unittest/test/test_discovery.py +++ b/Lib/unittest/test/test_discovery.py @@ -6,7 +6,6 @@ import pickle from test import support from test.support import import_helper -import test.test_importlib.util import unittest import unittest.mock @@ -826,6 +825,8 @@ def restore(): 'as dotted module names') def test_discovery_failed_discovery(self): + from test.test_importlib import util + loader = unittest.TestLoader() package = types.ModuleType('package') @@ -837,7 +838,7 @@ def _import(packagename, *args, **kwargs): # Since loader.discover() can modify sys.path, restore it when done. with import_helper.DirsOnSysPath(): # Make sure to remove 'package' from sys.modules when done. 
- with test.test_importlib.util.uncache('package'): + with util.uncache('package'): with self.assertRaises(TypeError) as cm: loader.discover('package') self.assertEqual(str(cm.exception), From d507493078ccbc9bf75d0643716cf7b355e375f0 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 4 Oct 2023 04:17:38 -0700 Subject: [PATCH 013/265] [3.11] gh-110332: Remove mentions of `random.WichmannHill` from `test_zlib` (GH-110334) (#110348) gh-110332: Remove mentions of `random.WichmannHill` from `test_zlib` (GH-110334) (cherry picked from commit e9f2352b7b7503519790ee6f51c2e298cf390e75) Co-authored-by: Nikita Sobolev --- Lib/test/test_zlib.py | 13 +------------ 1 file changed, 1 insertion(+), 12 deletions(-) diff --git a/Lib/test/test_zlib.py b/Lib/test/test_zlib.py index f20aad051da960..ff5ce11b965b7c 100644 --- a/Lib/test/test_zlib.py +++ b/Lib/test/test_zlib.py @@ -516,18 +516,7 @@ def test_odd_flush(self): # Try 17K of data # generate random data stream - try: - # In 2.3 and later, WichmannHill is the RNG of the bug report - gen = random.WichmannHill() - except AttributeError: - try: - # 2.2 called it Random - gen = random.Random() - except AttributeError: - # others might simply have a single RNG - gen = random - gen.seed(1) - data = gen.randbytes(17 * 1024) + data = random.randbytes(17 * 1024) # compress, sync-flush, and decompress first = co.compress(data) From aa8d3db6f201470b55debfe8a301317f39268ff1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 4 Oct 2023 05:03:28 -0700 Subject: [PATCH 014/265] [3.11] [3.12] gh-109972: Enhance test_gdb (GH-110026) (GH-110351) (#110354) [3.12] gh-109972: Enhance test_gdb (GH-110026) (GH-110351) * gh-109972: Enhance test_gdb (GH-110026) * Split test_pycfunction.py: add test_cfunction_full.py. Split the function into the following 6 functions. In verbose mode, these "pycfunction" tests now log each tested call. * test_pycfunction_noargs() * test_pycfunction_o() * test_pycfunction_varargs() * test_pycfunction_varargs_keywords() * test_pycfunction_fastcall() * test_pycfunction_fastcall_keywords() * Move get_gdb_repr() to PrettyPrintTests. * Replace DebuggerTests.get_sample_script() with SAMPLE_SCRIPT. * Rename checkout_hook_path to CHECKOUT_HOOK_PATH. * Rename gdb_version to GDB_VERSION_TEXT. * Replace (gdb_major_version, gdb_minor_version) with GDB_VERSION. * run_gdb() uses "backslashreplace" error handler instead of "replace". * Add check_gdb() function to util.py. * Enhance support.check_cflags_pgo(): check also for sysconfig PGO_PROF_USE_FLAG (if available) in compiler flags. * Move some SkipTest checks to test_gdb/__init__.py. * Elaborate why gdb cannot be tested on Windows: gdb doesn't support PDB debug symbol files. (cherry picked from commit 757cbd4f29c9e89b38b975e0463dc8ed331b2515) * gh-104736: Fix test_gdb tests on ppc64le with clang (GH-109360) Fix test_gdb on Python built with LLVM clang 16 on Linux ppc64le (ex: Fedora 38). Search patterns in gdb "bt" command output to detect when gdb fails to retrieve the traceback. For example, skip a test if "Backtrace stopped: frame did not save the PC" is found. (cherry picked from commit 44d9a71ea246e7c3fb478d9be62c16914be6c545) * gh-110166: Fix gdb CFunctionFullTests on ppc64le clang build (GH-110331) CFunctionFullTests now also runs "bt" command before "py-bt-full", similar to CFunctionTests which also runs "bt" command before "py-bt". 
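As a rough illustration (not part of the commit), the extra "bt" output lets the helper scan for some of the suspicious patterns that get_stack_trace() already checks; `out` below is a hypothetical gdb capture:

    suspicious = (
        '(frame information optimized out)',
        'Backtrace stopped: frame did not save the PC',
        ' ?? ()',
    )
    out = '#1  0x0000000000000000 in ?? ()'   # hypothetical truncated backtrace
    needs_skip = any(pattern in out for pattern in suspicious)
    print(needs_skip)   # -> True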
So test_gdb can skip the test if patterns like "?? ()" are found in the gdb output. (cherry picked from commit bbce8bd05dd25c6e74487940fa1977485b52baf4) Co-authored-by: Victor Stinner (cherry picked from commit 1de9406f9136e3952b849487f0151be3c669a3ea) Co-authored-by: Victor Stinner --- Lib/test/support/__init__.py | 7 +- Lib/test/test_gdb/__init__.py | 24 +- Lib/test/test_gdb/test_backtrace.py | 6 +- Lib/test/test_gdb/test_cfunction.py | 114 ++++---- Lib/test/test_gdb/test_cfunction_full.py | 36 +++ Lib/test/test_gdb/test_misc.py | 20 +- Lib/test/test_gdb/test_pretty_print.py | 54 +++- Lib/test/test_gdb/util.py | 256 ++++++++---------- ...-09-13-05-58-09.gh-issue-104736.lA25Fu.rst | 4 + 9 files changed, 303 insertions(+), 218 deletions(-) create mode 100644 Lib/test/test_gdb/test_cfunction_full.py create mode 100644 Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 6ee525c061f624..fbbd59d719d193 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -769,14 +769,17 @@ def check_cflags_pgo(): # Check if Python was built with ./configure --enable-optimizations: # with Profile Guided Optimization (PGO). cflags_nodist = sysconfig.get_config_var('PY_CFLAGS_NODIST') or '' - pgo_options = ( + pgo_options = [ # GCC '-fprofile-use', # clang: -fprofile-instr-use=code.profclangd '-fprofile-instr-use', # ICC "-prof-use", - ) + ] + PGO_PROF_USE_FLAG = sysconfig.get_config_var('PGO_PROF_USE_FLAG') + if PGO_PROF_USE_FLAG: + pgo_options.append(PGO_PROF_USE_FLAG) return any(option in cflags_nodist for option in pgo_options) diff --git a/Lib/test/test_gdb/__init__.py b/Lib/test/test_gdb/__init__.py index 0261f59adf54bd..d74075e456792d 100644 --- a/Lib/test/test_gdb/__init__.py +++ b/Lib/test/test_gdb/__init__.py @@ -4,7 +4,27 @@ # Lib/test/test_jit_gdb.py import os -from test.support import load_package_tests +import sysconfig +import unittest +from test import support + + +MS_WINDOWS = (os.name == 'nt') +if MS_WINDOWS: + # On Windows, Python is usually built by MSVC. Passing /p:DebugSymbols=true + # option to MSBuild produces PDB debug symbols, but gdb doesn't support PDB + # debug symbol files. 
+ raise unittest.SkipTest("test_gdb doesn't work on Windows") + +if support.PGO: + raise unittest.SkipTest("test_gdb is not useful for PGO") + +if not sysconfig.is_python_build(): + raise unittest.SkipTest("test_gdb only works on source builds at the moment.") + +if support.check_cflags_pgo(): + raise unittest.SkipTest("test_gdb is not reliable on PGO builds") + def load_tests(*args): - return load_package_tests(os.path.dirname(__file__), *args) + return support.load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/test/test_gdb/test_backtrace.py b/Lib/test/test_gdb/test_backtrace.py index 15cbcf169ab9e3..c41e7cb7c210de 100644 --- a/Lib/test/test_gdb/test_backtrace.py +++ b/Lib/test/test_gdb/test_backtrace.py @@ -3,7 +3,7 @@ from test import support from test.support import python_is_optimized -from .util import setup_module, DebuggerTests, CET_PROTECTION +from .util import setup_module, DebuggerTests, CET_PROTECTION, SAMPLE_SCRIPT def setUpModule(): @@ -15,7 +15,7 @@ class PyBtTests(DebuggerTests): "Python was compiled with optimizations") def test_bt(self): 'Verify that the "py-bt" command works' - bt = self.get_stack_trace(script=self.get_sample_script(), + bt = self.get_stack_trace(script=SAMPLE_SCRIPT, cmds_after_breakpoint=['py-bt']) self.assertMultilineMatches(bt, r'''^.* @@ -35,7 +35,7 @@ def test_bt(self): "Python was compiled with optimizations") def test_bt_full(self): 'Verify that the "py-bt-full" command works' - bt = self.get_stack_trace(script=self.get_sample_script(), + bt = self.get_stack_trace(script=SAMPLE_SCRIPT, cmds_after_breakpoint=['py-bt-full']) self.assertMultilineMatches(bt, r'''^.* diff --git a/Lib/test/test_gdb/test_cfunction.py b/Lib/test/test_gdb/test_cfunction.py index 55796d021511e1..0a62014923e61f 100644 --- a/Lib/test/test_gdb/test_cfunction.py +++ b/Lib/test/test_gdb/test_cfunction.py @@ -1,8 +1,6 @@ -import re import textwrap import unittest from test import support -from test.support import python_is_optimized from .util import setup_module, DebuggerTests @@ -11,10 +9,22 @@ def setUpModule(): setup_module() -@unittest.skipIf(python_is_optimized(), +@unittest.skipIf(support.python_is_optimized(), "Python was compiled with optimizations") @support.requires_resource('cpu') class CFunctionTests(DebuggerTests): + def check(self, func_name, cmd): + # Verify with "py-bt": + gdb_output = self.get_stack_trace( + cmd, + breakpoint=func_name, + cmds_after_breakpoint=['bt', 'py-bt'], + # bpo-45207: Ignore 'Function "meth_varargs" not + # defined.' message in stderr. 
+ ignore_stderr=True, + ) + self.assertIn(f'\n.*") @@ -167,7 +167,7 @@ class PyLocalsTests(DebuggerTests): @unittest.skipIf(python_is_optimized(), "Python was compiled with optimizations") def test_basic_command(self): - bt = self.get_stack_trace(script=self.get_sample_script(), + bt = self.get_stack_trace(script=SAMPLE_SCRIPT, cmds_after_breakpoint=['py-up', 'py-locals']) self.assertMultilineMatches(bt, r".*\nargs = \(1, 2, 3\)\n.*") @@ -176,7 +176,7 @@ def test_basic_command(self): @unittest.skipIf(python_is_optimized(), "Python was compiled with optimizations") def test_locals_after_up(self): - bt = self.get_stack_trace(script=self.get_sample_script(), + bt = self.get_stack_trace(script=SAMPLE_SCRIPT, cmds_after_breakpoint=['py-up', 'py-up', 'py-locals']) self.assertMultilineMatches(bt, r'''^.* diff --git a/Lib/test/test_gdb/test_pretty_print.py b/Lib/test/test_gdb/test_pretty_print.py index e31dc66f29684a..dfc77d65ab16a4 100644 --- a/Lib/test/test_gdb/test_pretty_print.py +++ b/Lib/test/test_gdb/test_pretty_print.py @@ -3,7 +3,7 @@ from test import support from .util import ( - BREAKPOINT_FN, gdb_major_version, gdb_minor_version, + BREAKPOINT_FN, GDB_VERSION, run_gdb, setup_module, DebuggerTests) @@ -12,6 +12,42 @@ def setUpModule(): class PrettyPrintTests(DebuggerTests): + def get_gdb_repr(self, source, + cmds_after_breakpoint=None, + import_site=False): + # Given an input python source representation of data, + # run "python -c'id(DATA)'" under gdb with a breakpoint on + # builtin_id and scrape out gdb's representation of the "op" + # parameter, and verify that the gdb displays the same string + # + # Verify that the gdb displays the expected string + # + # For a nested structure, the first time we hit the breakpoint will + # give us the top-level structure + + # NOTE: avoid decoding too much of the traceback as some + # undecodable characters may lurk there in optimized mode + # (issue #19743). + cmds_after_breakpoint = cmds_after_breakpoint or ["backtrace 1"] + gdb_output = self.get_stack_trace(source, breakpoint=BREAKPOINT_FN, + cmds_after_breakpoint=cmds_after_breakpoint, + import_site=import_site) + # gdb can insert additional '\n' and space characters in various places + # in its output, depending on the width of the terminal it's connected + # to (using its "wrap_here" function) + m = re.search( + # Match '#0 builtin_id(self=..., v=...)' + r'#0\s+builtin_id\s+\(self\=.*,\s+v=\s*(.*?)?\)' + # Match ' at Python/bltinmodule.c'. + # bpo-38239: builtin_id() is defined in Python/bltinmodule.c, + # but accept any "Directory\file.c" to support Link Time + # Optimization (LTO). + r'\s+at\s+\S*[A-Za-z]+/[A-Za-z0-9_-]+\.c', + gdb_output, re.DOTALL) + if not m: + self.fail('Unexpected gdb output: %r\n%s' % (gdb_output, gdb_output)) + return m.group(1), gdb_output + def test_getting_backtrace(self): gdb_output = self.get_stack_trace('id(42)') self.assertTrue(BREAKPOINT_FN in gdb_output) @@ -75,15 +111,17 @@ def test_strings(self): # as GDB might have been linked against a different version # of Python with a different encoding and coercion policy # with respect to PEP 538 and PEP 540. 
- out, err = run_gdb( + stdout, stderr = run_gdb( '--eval-command', 'python import locale; print(locale.getpreferredencoding())') - encoding = out.rstrip() - if err or not encoding: + encoding = stdout + if stderr or not encoding: raise RuntimeError( - f'unable to determine the preferred encoding ' - f'of embedded Python in GDB: {err}') + f'unable to determine the Python locale preferred encoding ' + f'of embedded Python in GDB\n' + f'stdout={stdout!r}\n' + f'stderr={stderr!r}') def check_repr(text): try: @@ -122,7 +160,7 @@ def test_tuples(self): @support.requires_resource('cpu') def test_sets(self): 'Verify the pretty-printing of sets' - if (gdb_major_version, gdb_minor_version) < (7, 3): + if GDB_VERSION < (7, 3): self.skipTest("pretty-printing of sets needs gdb 7.3 or later") self.assertGdbRepr(set(), "set()") self.assertGdbRepr(set(['a']), "{'a'}") @@ -141,7 +179,7 @@ def test_sets(self): @support.requires_resource('cpu') def test_frozensets(self): 'Verify the pretty-printing of frozensets' - if (gdb_major_version, gdb_minor_version) < (7, 3): + if GDB_VERSION < (7, 3): self.skipTest("pretty-printing of frozensets needs gdb 7.3 or later") self.assertGdbRepr(frozenset(), "frozenset()") self.assertGdbRepr(frozenset(['a']), "frozenset({'a'})") diff --git a/Lib/test/test_gdb/util.py b/Lib/test/test_gdb/util.py index 30beb4e14285c7..7f4e3cba3534bd 100644 --- a/Lib/test/test_gdb/util.py +++ b/Lib/test/test_gdb/util.py @@ -1,5 +1,6 @@ import os import re +import shlex import subprocess import sys import sysconfig @@ -7,29 +8,74 @@ from test import support -MS_WINDOWS = (sys.platform == 'win32') -if MS_WINDOWS: - raise unittest.SkipTest("test_gdb doesn't work on Windows") +# Location of custom hooks file in a repository checkout. +CHECKOUT_HOOK_PATH = os.path.join(os.path.dirname(sys.executable), + 'python-gdb.py') + +SAMPLE_SCRIPT = os.path.join(os.path.dirname(__file__), 'gdb_sample.py') +BREAKPOINT_FN = 'builtin_id' + +PYTHONHASHSEED = '123' + + +def clean_environment(): + # Remove PYTHON* environment variables such as PYTHONHOME + return {name: value for name, value in os.environ.items() + if not name.startswith('PYTHON')} + + +# Temporary value until it's initialized by get_gdb_version() below +GDB_VERSION = (0, 0) + +def run_gdb(*args, exitcode=0, **env_vars): + """Runs gdb in --batch mode with the additional arguments given by *args. + + Returns its (stdout, stderr) decoded from utf-8 using the replace handler. 
+ """ + env = clean_environment() + if env_vars: + env.update(env_vars) + + cmd = ['gdb', + # Batch mode: Exit after processing all the command files + # specified with -x/--command + '--batch', + # -nx: Do not execute commands from any .gdbinit initialization + # files (gh-66384) + '-nx'] + if GDB_VERSION >= (7, 4): + cmd.extend(('--init-eval-command', + f'add-auto-load-safe-path {CHECKOUT_HOOK_PATH}')) + cmd.extend(args) + + proc = subprocess.run( + cmd, + # Redirect stdin to prevent gdb from messing with the terminal settings + stdin=subprocess.PIPE, + stdout=subprocess.PIPE, + stderr=subprocess.PIPE, + encoding="utf8", errors="backslashreplace", + env=env) + + stdout = proc.stdout + stderr = proc.stderr + if proc.returncode != exitcode: + cmd_text = shlex.join(cmd) + raise Exception(f"{cmd_text} failed with exit code {proc.returncode}, " + f"expected exit code {exitcode}:\n" + f"stdout={stdout!r}\n" + f"stderr={stderr!r}") + + return (stdout, stderr) def get_gdb_version(): try: - cmd = ["gdb", "-nx", "--version"] - proc = subprocess.Popen(cmd, - stdout=subprocess.PIPE, - stderr=subprocess.PIPE, - universal_newlines=True) - with proc: - version, stderr = proc.communicate() - - if proc.returncode: - raise Exception(f"Command {' '.join(cmd)!r} failed " - f"with exit code {proc.returncode}: " - f"stdout={version!r} stderr={stderr!r}") + stdout, stderr = run_gdb('--version') except OSError: # This is what "no gdb" looks like. There may, however, be other # errors that manifest this way too. - raise unittest.SkipTest("Couldn't find gdb on the path") + raise unittest.SkipTest("Couldn't find gdb program on the path") # Regex to parse: # 'GNU gdb (GDB; SUSE Linux Enterprise 12) 7.7\n' -> 7.7 @@ -37,32 +83,48 @@ def get_gdb_version(): # 'GNU gdb 6.1.1 [FreeBSD]\n' -> 6.1 # 'GNU gdb (GDB) Fedora (7.5.1-37.fc18)\n' -> 7.5 # 'HP gdb 6.7 for HP Itanium (32 or 64 bit) and target HP-UX 11iv2 and 11iv3.\n' -> 6.7 - match = re.search(r"^(?:GNU|HP) gdb.*?\b(\d+)\.(\d+)", version) + match = re.search(r"^(?:GNU|HP) gdb.*?\b(\d+)\.(\d+)", stdout) if match is None: - raise Exception("unable to parse GDB version: %r" % version) - return (version, int(match.group(1)), int(match.group(2))) + raise Exception("unable to parse gdb version: %r" % stdout) + version_text = stdout + major = int(match.group(1)) + minor = int(match.group(2)) + version = (major, minor) + return (version_text, version) -gdb_version, gdb_major_version, gdb_minor_version = get_gdb_version() -if gdb_major_version < 7: - raise unittest.SkipTest("gdb versions before 7.0 didn't support python " - "embedding. Saw %s.%s:\n%s" - % (gdb_major_version, gdb_minor_version, - gdb_version)) +GDB_VERSION_TEXT, GDB_VERSION = get_gdb_version() +if GDB_VERSION < (7, 0): + raise unittest.SkipTest( + f"gdb versions before 7.0 didn't support python embedding. " + f"Saw gdb version {GDB_VERSION[0]}.{GDB_VERSION[1]}:\n" + f"{GDB_VERSION_TEXT}") -if not sysconfig.is_python_build(): - raise unittest.SkipTest("test_gdb only works on source builds at the moment.") -if ((sysconfig.get_config_var('PGO_PROF_USE_FLAG') or 'xxx') in - (sysconfig.get_config_var('PY_CORE_CFLAGS') or '')): - raise unittest.SkipTest("test_gdb is not reliable on PGO builds") +def check_usable_gdb(): + # Verify that "gdb" was built with the embedded Python support enabled and + # verify that "gdb" can load our custom hooks, as OS security settings may + # disallow this without a customized .gdbinit. 
+ stdout, stderr = run_gdb( + '--eval-command=python import sys; print(sys.version_info)', + '--args', sys.executable) -# Location of custom hooks file in a repository checkout. -checkout_hook_path = os.path.join(os.path.dirname(sys.executable), - 'python-gdb.py') + if "auto-loading has been declined" in stderr: + raise unittest.SkipTest( + f"gdb security settings prevent use of custom hooks; " + f"stderr: {stderr!r}") -PYTHONHASHSEED = '123' + if not stdout: + raise unittest.SkipTest( + f"gdb not built with embedded python support; " + f"stderr: {stderr!r}") + + if "major=2" in stdout: + raise unittest.SkipTest("gdb built with Python 2") +check_usable_gdb() + +# Control-flow enforcement technology def cet_protection(): cflags = sysconfig.get_config_var('CFLAGS') if not cflags: @@ -74,63 +136,17 @@ def cet_protection(): and any((flag.startswith('-fcf-protection') and not flag.endswith(("=none", "=return"))) for flag in flags)) - -# Control-flow enforcement technology CET_PROTECTION = cet_protection() -def run_gdb(*args, **env_vars): - """Runs gdb in --batch mode with the additional arguments given by *args. - - Returns its (stdout, stderr) decoded from utf-8 using the replace handler. - """ - if env_vars: - env = os.environ.copy() - env.update(env_vars) - else: - env = None - # -nx: Do not execute commands from any .gdbinit initialization files - # (issue #22188) - base_cmd = ('gdb', '--batch', '-nx') - if (gdb_major_version, gdb_minor_version) >= (7, 4): - base_cmd += ('-iex', 'add-auto-load-safe-path ' + checkout_hook_path) - proc = subprocess.Popen(base_cmd + args, - # Redirect stdin to prevent GDB from messing with - # the terminal settings - stdin=subprocess.PIPE, - stdout=subprocess.PIPE, - stderr=subprocess.PIPE, - env=env) - with proc: - out, err = proc.communicate() - return out.decode('utf-8', 'replace'), err.decode('utf-8', 'replace') - -# Verify that "gdb" was built with the embedded python support enabled: -gdbpy_version, _ = run_gdb("--eval-command=python import sys; print(sys.version_info)") -if not gdbpy_version: - raise unittest.SkipTest("gdb not built with embedded python support") - -if "major=2" in gdbpy_version: - raise unittest.SkipTest("gdb built with Python 2") - -# Verify that "gdb" can load our custom hooks, as OS security settings may -# disallow this without a customized .gdbinit. -_, gdbpy_errors = run_gdb('--args', sys.executable) -if "auto-loading has been declined" in gdbpy_errors: - msg = "gdb security settings prevent use of custom hooks: " - raise unittest.SkipTest(msg + gdbpy_errors.rstrip()) - -BREAKPOINT_FN='builtin_id' - - def setup_module(): if support.verbose: - print("GDB version %s.%s:" % (gdb_major_version, gdb_minor_version)) - for line in gdb_version.splitlines(): + print(f"gdb version {GDB_VERSION[0]}.{GDB_VERSION[1]}:") + for line in GDB_VERSION_TEXT.splitlines(): print(" " * 4 + line) + print() -@unittest.skipIf(support.PGO, "not useful for PGO") class DebuggerTests(unittest.TestCase): """Test that the debugger can debug Python.""" @@ -163,20 +179,22 @@ def get_stack_trace(self, source=None, script=None, # structures # Generate a list of commands in gdb's language: - commands = ['set breakpoint pending yes', - 'break %s' % breakpoint, - - # The tests assume that the first frame of printed - # backtrace will not contain program counter, - # that is however not guaranteed by gdb - # therefore we need to use 'set print address off' to - # make sure the counter is not there. For example: - # #0 in PyObject_Print ... 
- # is assumed, but sometimes this can be e.g. - # #0 0x00003fffb7dd1798 in PyObject_Print ... - 'set print address off', - - 'run'] + commands = [ + 'set breakpoint pending yes', + 'break %s' % breakpoint, + + # The tests assume that the first frame of printed + # backtrace will not contain program counter, + # that is however not guaranteed by gdb + # therefore we need to use 'set print address off' to + # make sure the counter is not there. For example: + # #0 in PyObject_Print ... + # is assumed, but sometimes this can be e.g. + # #0 0x00003fffb7dd1798 in PyObject_Print ... + 'set print address off', + + 'run', + ] # GDB as of 7.4 onwards can distinguish between the # value of a variable at entry vs current value: @@ -184,7 +202,7 @@ def get_stack_trace(self, source=None, script=None, # which leads to the selftests failing with errors like this: # AssertionError: 'v@entry=()' != '()' # Disable this: - if (gdb_major_version, gdb_minor_version) >= (7, 4): + if GDB_VERSION >= (7, 4): commands += ['set print entry-values no'] if cmds_after_breakpoint: @@ -237,13 +255,16 @@ def get_stack_trace(self, source=None, script=None, for pattern in ( '(frame information optimized out)', 'Unable to read information on python frame', + # gh-91960: On Python built with "clang -Og", gdb gets # "frame=" for _PyEval_EvalFrameDefault() parameter '(unable to read python frame information)', + # gh-104736: On Python built with "clang -Og" on ppc64le, # "py-bt" displays a truncated or not traceback, but "where" # logs this error message: 'Backtrace stopped: frame did not save the PC', + # gh-104736: When "bt" command displays something like: # "#1 0x0000000000000000 in ?? ()", the traceback is likely # truncated or wrong. @@ -254,42 +275,6 @@ def get_stack_trace(self, source=None, script=None, return out - def get_gdb_repr(self, source, - cmds_after_breakpoint=None, - import_site=False): - # Given an input python source representation of data, - # run "python -c'id(DATA)'" under gdb with a breakpoint on - # builtin_id and scrape out gdb's representation of the "op" - # parameter, and verify that the gdb displays the same string - # - # Verify that the gdb displays the expected string - # - # For a nested structure, the first time we hit the breakpoint will - # give us the top-level structure - - # NOTE: avoid decoding too much of the traceback as some - # undecodable characters may lurk there in optimized mode - # (issue #19743). - cmds_after_breakpoint = cmds_after_breakpoint or ["backtrace 1"] - gdb_output = self.get_stack_trace(source, breakpoint=BREAKPOINT_FN, - cmds_after_breakpoint=cmds_after_breakpoint, - import_site=import_site) - # gdb can insert additional '\n' and space characters in various places - # in its output, depending on the width of the terminal it's connected - # to (using its "wrap_here" function) - m = re.search( - # Match '#0 builtin_id(self=..., v=...)' - r'#0\s+builtin_id\s+\(self\=.*,\s+v=\s*(.*?)?\)' - # Match ' at Python/bltinmodule.c'. - # bpo-38239: builtin_id() is defined in Python/bltinmodule.c, - # but accept any "Directory\file.c" to support Link Time - # Optimization (LTO). 
- r'\s+at\s+\S*[A-Za-z]+/[A-Za-z0-9_-]+\.c', - gdb_output, re.DOTALL) - if not m: - self.fail('Unexpected gdb output: %r\n%s' % (gdb_output, gdb_output)) - return m.group(1), gdb_output - def assertEndsWith(self, actual, exp_end): '''Ensure that the given "actual" string ends with "exp_end"''' self.assertTrue(actual.endswith(exp_end), @@ -299,6 +284,3 @@ def assertMultilineMatches(self, actual, pattern): m = re.match(pattern, actual, re.DOTALL) if not m: self.fail(msg='%r did not match %r' % (actual, pattern)) - - def get_sample_script(self): - return os.path.join(os.path.dirname(__file__), 'gdb_sample.py') diff --git a/Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst b/Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst new file mode 100644 index 00000000000000..85c370fc87ac41 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst @@ -0,0 +1,4 @@ +Fix test_gdb on Python built with LLVM clang 16 on Linux ppc64le (ex: Fedora +38). Search patterns in gdb "bt" command output to detect when gdb fails to +retrieve the traceback. For example, skip a test if ``Backtrace stopped: frame +did not save the PC`` is found. Patch by Victor Stinner. From 8f22504d745c4a843ff30b6810ab606817480746 Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Wed, 4 Oct 2023 14:07:57 +0200 Subject: [PATCH 015/265] [3.11] gh-109974: Fix threading lock_tests race conditions (#110057) (#110355) [3.12] gh-109974: Fix threading lock_tests race conditions (#110057) (#110346) * gh-109974: Fix threading lock_tests race conditions (#110057) Fix race conditions in test_threading lock tests. Wait until a condition is met rather than using time.sleep() with a hardcoded number of seconds. * Replace sleeping loops with support.sleeping_retry() which raises an exception on timeout. * Add wait_threads_blocked(nthread) which computes a sleep depending on the number of threads. Remove _wait() function. * test_set_and_clear(): use a way longer Event.wait() timeout. * BarrierTests.test_repr(): wait until the 2 threads are waiting for the barrier. Use a way longer timeout for Barrier.wait() timeout. * test_thread_leak() no longer needs to count len(threading.enumerate()): Bunch uses threading_helper.wait_threads_exit() internally which does it in wait_for_finished(). * Add BaseLockTests.wait_phase() which implements a timeout. test_reacquire() and test_recursion_count() use wait_phase(). (cherry picked from commit 4e356ad183eeb567783f4a87fd092573da1e9252) * gh-109974: Fix more threading lock_tests race conditions (#110089) * Add context manager on Bunch class. * Bunch now catchs exceptions on executed functions and re-raise them at __exit__() as an ExceptionGroup. * Rewrite BarrierProxy.test_default_timeout(). Use a single thread. Only check that barrier.wait() blocks for at least default timeout seconds. * test_with(): inline _with() function. 
(cherry picked from commit 743e3572ee940a6cf88fd518e5f4a447905ba5eb) (cherry picked from commit 1d032ea3d67e9725b63322f896d9aa727fd75521) --- Lib/test/lock_tests.py | 623 +++++++++++------- Lib/test/test_importlib/test_locks.py | 3 +- ...-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst | 3 + 3 files changed, 378 insertions(+), 251 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst diff --git a/Lib/test/lock_tests.py b/Lib/test/lock_tests.py index d5ac1319064721..c5502ac0cdbddf 100644 --- a/Lib/test/lock_tests.py +++ b/Lib/test/lock_tests.py @@ -20,54 +20,74 @@ "(no _at_fork_reinit method)") -def _wait(): - # A crude wait/yield function not relying on synchronization primitives. - time.sleep(0.01) +def wait_threads_blocked(nthread): + # Arbitrary sleep to wait until N threads are blocked, + # like waiting for a lock. + time.sleep(0.010 * nthread) + class Bunch(object): """ A bunch of threads. """ - def __init__(self, f, n, wait_before_exit=False): + def __init__(self, func, nthread, wait_before_exit=False): """ - Construct a bunch of `n` threads running the same function `f`. + Construct a bunch of `nthread` threads running the same function `func`. If `wait_before_exit` is True, the threads won't terminate until do_finish() is called. """ - self.f = f - self.n = n + self.func = func + self.nthread = nthread self.started = [] self.finished = [] + self.exceptions = [] self._can_exit = not wait_before_exit - self.wait_thread = threading_helper.wait_threads_exit() - self.wait_thread.__enter__() + self._wait_thread = None - def task(): - tid = threading.get_ident() - self.started.append(tid) - try: - f() - finally: - self.finished.append(tid) - while not self._can_exit: - _wait() + def task(self): + tid = threading.get_ident() + self.started.append(tid) + try: + self.func() + except BaseException as exc: + self.exceptions.append(exc) + finally: + self.finished.append(tid) + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if self._can_exit: + break + + def __enter__(self): + self._wait_thread = threading_helper.wait_threads_exit(support.SHORT_TIMEOUT) + self._wait_thread.__enter__() try: - for i in range(n): - start_new_thread(task, ()) + for _ in range(self.nthread): + start_new_thread(self.task, ()) except: self._can_exit = True raise - def wait_for_started(self): - while len(self.started) < self.n: - _wait() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(self.started) >= self.nthread: + break + + return self + + def __exit__(self, exc_type, exc_value, traceback): + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(self.finished) >= self.nthread: + break - def wait_for_finished(self): - while len(self.finished) < self.n: - _wait() - # Wait for threads exit - self.wait_thread.__exit__(None, None, None) + # Wait until threads completely exit according to _thread._count() + self._wait_thread.__exit__(None, None, None) + + # Break reference cycle + exceptions = self.exceptions + self.exceptions = None + if exceptions: + raise ExceptionGroup(f"{self.func} threads raised exceptions", + exceptions) def do_finish(self): self._can_exit = True @@ -95,6 +115,12 @@ class BaseLockTests(BaseTestCase): Tests for both recursive and non-recursive locks. 
""" + def wait_phase(self, phase, expected): + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(phase) >= expected: + break + self.assertEqual(len(phase), expected) + def test_constructor(self): lock = self.locktype() del lock @@ -132,41 +158,57 @@ def test_try_acquire_contended(self): result = [] def f(): result.append(lock.acquire(False)) - Bunch(f, 1).wait_for_finished() + with Bunch(f, 1): + pass self.assertFalse(result[0]) lock.release() def test_acquire_contended(self): lock = self.locktype() lock.acquire() - N = 5 def f(): lock.acquire() lock.release() - b = Bunch(f, N) - b.wait_for_started() - _wait() - self.assertEqual(len(b.finished), 0) - lock.release() - b.wait_for_finished() - self.assertEqual(len(b.finished), N) + N = 5 + with Bunch(f, N) as bunch: + # Threads block on lock.acquire() + wait_threads_blocked(N) + self.assertEqual(len(bunch.finished), 0) + + # Threads unblocked + lock.release() + + self.assertEqual(len(bunch.finished), N) def test_with(self): lock = self.locktype() def f(): lock.acquire() lock.release() - def _with(err=None): + + def with_lock(err=None): with lock: if err is not None: raise err - _with() - # Check the lock is unacquired - Bunch(f, 1).wait_for_finished() - self.assertRaises(TypeError, _with, TypeError) - # Check the lock is unacquired - Bunch(f, 1).wait_for_finished() + + # Acquire the lock, do nothing, with releases the lock + with lock: + pass + + # Check that the lock is unacquired + with Bunch(f, 1): + pass + + # Acquire the lock, raise an exception, with releases the lock + with self.assertRaises(TypeError): + with lock: + raise TypeError + + # Check that the lock is unacquired even if after an exception + # was raised in the previous "with lock:" block + with Bunch(f, 1): + pass def test_thread_leak(self): # The lock shouldn't leak a Thread instance when used from a foreign @@ -175,17 +217,11 @@ def test_thread_leak(self): def f(): lock.acquire() lock.release() - n = len(threading.enumerate()) + # We run many threads in the hope that existing threads ids won't # be recycled. - Bunch(f, 15).wait_for_finished() - if len(threading.enumerate()) != n: - # There is a small window during which a Thread instance's - # target function has finished running, but the Thread is still - # alive and registered. Avoid spurious failures by waiting a - # bit more (seen on a buildbot). - time.sleep(0.4) - self.assertEqual(n, len(threading.enumerate())) + with Bunch(f, 15): + pass def test_timeout(self): lock = self.locktype() @@ -209,7 +245,8 @@ def f(): results.append(lock.acquire(timeout=0.5)) t2 = time.monotonic() results.append(t2 - t1) - Bunch(f, 1).wait_for_finished() + with Bunch(f, 1): + pass self.assertFalse(results[0]) self.assertTimeout(results[1], 0.5) @@ -243,15 +280,13 @@ def f(): phase.append(None) with threading_helper.wait_threads_exit(): + # Thread blocked on lock.acquire() start_new_thread(f, ()) - while len(phase) == 0: - _wait() - _wait() - self.assertEqual(len(phase), 1) + self.wait_phase(phase, 1) + + # Thread unblocked lock.release() - while len(phase) == 1: - _wait() - self.assertEqual(len(phase), 2) + self.wait_phase(phase, 2) def test_different_thread(self): # Lock can be released from a different thread. 
@@ -259,8 +294,8 @@ def test_different_thread(self): lock.acquire() def f(): lock.release() - b = Bunch(f, 1) - b.wait_for_finished() + with Bunch(f, 1): + pass lock.acquire() lock.release() @@ -350,21 +385,20 @@ def test_recursion_count(self): def f(): lock.acquire() phase.append(None) - while len(phase) == 1: - _wait() + + self.wait_phase(phase, 2) lock.release() phase.append(None) with threading_helper.wait_threads_exit(): + # Thread blocked on lock.acquire() start_new_thread(f, ()) - while len(phase) == 0: - _wait() - self.assertEqual(len(phase), 1) + self.wait_phase(phase, 1) self.assertEqual(0, lock._recursion_count()) + + # Thread unblocked phase.append(None) - while len(phase) == 2: - _wait() - self.assertEqual(len(phase), 3) + self.wait_phase(phase, 3) self.assertEqual(0, lock._recursion_count()) def test_different_thread(self): @@ -372,12 +406,12 @@ def test_different_thread(self): lock = self.locktype() def f(): lock.acquire() - b = Bunch(f, 1, True) - try: - self.assertRaises(RuntimeError, lock.release) - finally: - b.do_finish() - b.wait_for_finished() + + with Bunch(f, 1, True) as bunch: + try: + self.assertRaises(RuntimeError, lock.release) + finally: + bunch.do_finish() def test__is_owned(self): lock = self.locktype() @@ -389,7 +423,8 @@ def test__is_owned(self): result = [] def f(): result.append(lock._is_owned()) - Bunch(f, 1).wait_for_finished() + with Bunch(f, 1): + pass self.assertFalse(result[0]) lock.release() self.assertTrue(lock._is_owned()) @@ -422,12 +457,15 @@ def _check_notify(self, evt): def f(): results1.append(evt.wait()) results2.append(evt.wait()) - b = Bunch(f, N) - b.wait_for_started() - _wait() - self.assertEqual(len(results1), 0) - evt.set() - b.wait_for_finished() + + with Bunch(f, N): + # Threads blocked on first evt.wait() + wait_threads_blocked(N) + self.assertEqual(len(results1), 0) + + # Threads unblocked + evt.set() + self.assertEqual(results1, [True] * N) self.assertEqual(results2, [True] * N) @@ -450,35 +488,43 @@ def f(): r = evt.wait(0.5) t2 = time.monotonic() results2.append((r, t2 - t1)) - Bunch(f, N).wait_for_finished() + + with Bunch(f, N): + pass + self.assertEqual(results1, [False] * N) for r, dt in results2: self.assertFalse(r) self.assertTimeout(dt, 0.5) + # The event is set results1 = [] results2 = [] evt.set() - Bunch(f, N).wait_for_finished() + with Bunch(f, N): + pass + self.assertEqual(results1, [True] * N) for r, dt in results2: self.assertTrue(r) def test_set_and_clear(self): - # Issue #13502: check that wait() returns true even when the event is + # gh-57711: check that wait() returns true even when the event is # cleared before the waiting thread is woken up. - evt = self.eventtype() + event = self.eventtype() results = [] - timeout = 0.250 - N = 5 def f(): - results.append(evt.wait(timeout * 4)) - b = Bunch(f, N) - b.wait_for_started() - time.sleep(timeout) - evt.set() - evt.clear() - b.wait_for_finished() + results.append(event.wait(support.LONG_TIMEOUT)) + + N = 5 + with Bunch(f, N): + # Threads blocked on event.wait() + wait_threads_blocked(N) + + # Threads unblocked + event.set() + event.clear() + self.assertEqual(results, [True] * N) @requires_fork @@ -534,15 +580,14 @@ def _check_notify(self, cond): # Note that this test is sensitive to timing. If the worker threads # don't execute in a timely fashion, the main thread may think they # are further along then they are. The main thread therefore issues - # _wait() statements to try to make sure that it doesn't race ahead - # of the workers. 
+ # wait_threads_blocked() statements to try to make sure that it doesn't + # race ahead of the workers. # Secondly, this test assumes that condition variables are not subject # to spurious wakeups. The absence of spurious wakeups is an implementation # detail of Condition Variables in current CPython, but in general, not # a guaranteed property of condition variables as a programming # construct. In particular, it is possible that this can no longer # be conveniently guaranteed should their implementation ever change. - N = 5 ready = [] results1 = [] results2 = [] @@ -551,58 +596,83 @@ def f(): cond.acquire() ready.append(phase_num) result = cond.wait() + cond.release() results1.append((result, phase_num)) + cond.acquire() ready.append(phase_num) + result = cond.wait() cond.release() results2.append((result, phase_num)) - b = Bunch(f, N) - b.wait_for_started() - # first wait, to ensure all workers settle into cond.wait() before - # we continue. See issues #8799 and #30727. - while len(ready) < 5: - _wait() - ready.clear() - self.assertEqual(results1, []) - # Notify 3 threads at first - cond.acquire() - cond.notify(3) - _wait() - phase_num = 1 - cond.release() - while len(results1) < 3: - _wait() - self.assertEqual(results1, [(True, 1)] * 3) - self.assertEqual(results2, []) - # make sure all awaken workers settle into cond.wait() - while len(ready) < 3: - _wait() - # Notify 5 threads: they might be in their first or second wait - cond.acquire() - cond.notify(5) - _wait() - phase_num = 2 - cond.release() - while len(results1) + len(results2) < 8: - _wait() - self.assertEqual(results1, [(True, 1)] * 3 + [(True, 2)] * 2) - self.assertEqual(results2, [(True, 2)] * 3) - # make sure all workers settle into cond.wait() - while len(ready) < 5: - _wait() - # Notify all threads: they are all in their second wait - cond.acquire() - cond.notify_all() - _wait() - phase_num = 3 - cond.release() - while len(results2) < 5: - _wait() - self.assertEqual(results1, [(True, 1)] * 3 + [(True,2)] * 2) - self.assertEqual(results2, [(True, 2)] * 3 + [(True, 3)] * 2) - b.wait_for_finished() + + N = 5 + with Bunch(f, N): + # first wait, to ensure all workers settle into cond.wait() before + # we continue. See issues #8799 and #30727. 
+ for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(ready) >= N: + break + + ready.clear() + self.assertEqual(results1, []) + + # Notify 3 threads at first + count1 = 3 + cond.acquire() + cond.notify(count1) + wait_threads_blocked(count1) + + # Phase 1 + phase_num = 1 + cond.release() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(results1) >= count1: + break + + self.assertEqual(results1, [(True, 1)] * count1) + self.assertEqual(results2, []) + + # Wait until awaken workers are blocked on cond.wait() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(ready) >= count1 : + break + + # Notify 5 threads: they might be in their first or second wait + cond.acquire() + cond.notify(5) + wait_threads_blocked(N) + + # Phase 2 + phase_num = 2 + cond.release() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(results1) + len(results2) >= (N + count1): + break + + count2 = N - count1 + self.assertEqual(results1, [(True, 1)] * count1 + [(True, 2)] * count2) + self.assertEqual(results2, [(True, 2)] * count1) + + # Make sure all workers settle into cond.wait() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(ready) >= N: + break + + # Notify all threads: they are all in their second wait + cond.acquire() + cond.notify_all() + wait_threads_blocked(N) + + # Phase 3 + phase_num = 3 + cond.release() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(results2) >= N: + break + self.assertEqual(results1, [(True, 1)] * count1 + [(True, 2)] * count2) + self.assertEqual(results2, [(True, 2)] * count1 + [(True, 3)] * count2) def test_notify(self): cond = self.condtype() @@ -612,19 +682,23 @@ def test_notify(self): def test_timeout(self): cond = self.condtype() + timeout = 0.5 results = [] - N = 5 def f(): cond.acquire() t1 = time.monotonic() - result = cond.wait(0.5) + result = cond.wait(timeout) t2 = time.monotonic() cond.release() results.append((t2 - t1, result)) - Bunch(f, N).wait_for_finished() + + N = 5 + with Bunch(f, N): + pass self.assertEqual(len(results), N) + for dt, result in results: - self.assertTimeout(dt, 0.5) + self.assertTimeout(dt, timeout) # Note that conceptually (that"s the condition variable protocol) # a wait() may succeed even if no one notifies us and before any # timeout occurs. Spurious wakeups can occur. @@ -637,17 +711,16 @@ def test_waitfor(self): state = 0 def f(): with cond: - result = cond.wait_for(lambda : state==4) + result = cond.wait_for(lambda: state == 4) self.assertTrue(result) self.assertEqual(state, 4) - b = Bunch(f, 1) - b.wait_for_started() - for i in range(4): - time.sleep(0.01) - with cond: - state += 1 - cond.notify() - b.wait_for_finished() + + with Bunch(f, 1): + for i in range(4): + time.sleep(0.010) + with cond: + state += 1 + cond.notify() def test_waitfor_timeout(self): cond = self.condtype() @@ -661,15 +734,15 @@ def f(): self.assertFalse(result) self.assertTimeout(dt, 0.1) success.append(None) - b = Bunch(f, 1) - b.wait_for_started() - # Only increment 3 times, so state == 4 is never reached. - for i in range(3): - time.sleep(0.01) - with cond: - state += 1 - cond.notify() - b.wait_for_finished() + + with Bunch(f, 1): + # Only increment 3 times, so state == 4 is never reached. 
+ for i in range(3): + time.sleep(0.010) + with cond: + state += 1 + cond.notify() + self.assertEqual(len(success), 1) @@ -698,73 +771,107 @@ def test_acquire_destroy(self): del sem def test_acquire_contended(self): - sem = self.semtype(7) + sem_value = 7 + sem = self.semtype(sem_value) sem.acquire() - N = 10 + sem_results = [] results1 = [] results2 = [] phase_num = 0 - def f(): + + def func(): sem_results.append(sem.acquire()) results1.append(phase_num) + sem_results.append(sem.acquire()) results2.append(phase_num) - b = Bunch(f, 10) - b.wait_for_started() - while len(results1) + len(results2) < 6: - _wait() - self.assertEqual(results1 + results2, [0] * 6) - phase_num = 1 - for i in range(7): - sem.release() - while len(results1) + len(results2) < 13: - _wait() - self.assertEqual(sorted(results1 + results2), [0] * 6 + [1] * 7) - phase_num = 2 - for i in range(6): + + def wait_count(count): + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(results1) + len(results2) >= count: + break + + N = 10 + with Bunch(func, N): + # Phase 0 + count1 = sem_value - 1 + wait_count(count1) + self.assertEqual(results1 + results2, [0] * count1) + + # Phase 1 + phase_num = 1 + for i in range(sem_value): + sem.release() + count2 = sem_value + wait_count(count1 + count2) + self.assertEqual(sorted(results1 + results2), + [0] * count1 + [1] * count2) + + # Phase 2 + phase_num = 2 + count3 = (sem_value - 1) + for i in range(count3): + sem.release() + wait_count(count1 + count2 + count3) + self.assertEqual(sorted(results1 + results2), + [0] * count1 + [1] * count2 + [2] * count3) + # The semaphore is still locked + self.assertFalse(sem.acquire(False)) + + # Final release, to let the last thread finish + count4 = 1 sem.release() - while len(results1) + len(results2) < 19: - _wait() - self.assertEqual(sorted(results1 + results2), [0] * 6 + [1] * 7 + [2] * 6) - # The semaphore is still locked - self.assertFalse(sem.acquire(False)) - # Final release, to let the last thread finish - sem.release() - b.wait_for_finished() - self.assertEqual(sem_results, [True] * (6 + 7 + 6 + 1)) + + self.assertEqual(sem_results, + [True] * (count1 + count2 + count3 + count4)) def test_multirelease(self): - sem = self.semtype(7) + sem_value = 7 + sem = self.semtype(sem_value) sem.acquire() + results1 = [] results2 = [] phase_num = 0 - def f(): + def func(): sem.acquire() results1.append(phase_num) + sem.acquire() results2.append(phase_num) - b = Bunch(f, 10) - b.wait_for_started() - while len(results1) + len(results2) < 6: - _wait() - self.assertEqual(results1 + results2, [0] * 6) - phase_num = 1 - sem.release(7) - while len(results1) + len(results2) < 13: - _wait() - self.assertEqual(sorted(results1 + results2), [0] * 6 + [1] * 7) - phase_num = 2 - sem.release(6) - while len(results1) + len(results2) < 19: - _wait() - self.assertEqual(sorted(results1 + results2), [0] * 6 + [1] * 7 + [2] * 6) - # The semaphore is still locked - self.assertFalse(sem.acquire(False)) - # Final release, to let the last thread finish - sem.release() - b.wait_for_finished() + + def wait_count(count): + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if len(results1) + len(results2) >= count: + break + + with Bunch(func, 10): + # Phase 0 + count1 = sem_value - 1 + wait_count(count1) + self.assertEqual(results1 + results2, [0] * count1) + + # Phase 1 + phase_num = 1 + count2 = sem_value + sem.release(count2) + wait_count(count1 + count2) + self.assertEqual(sorted(results1 + results2), + [0] * count1 + [1] * count2) + + # Phase 2 + 
phase_num = 2 + count3 = sem_value - 1 + sem.release(count3) + wait_count(count1 + count2 + count3) + self.assertEqual(sorted(results1 + results2), + [0] * count1 + [1] * count2 + [2] * count3) + # The semaphore is still locked + self.assertFalse(sem.acquire(False)) + + # Final release, to let the last thread finish + sem.release() def test_try_acquire(self): sem = self.semtype(2) @@ -781,7 +888,8 @@ def test_try_acquire_contended(self): def f(): results.append(sem.acquire(False)) results.append(sem.acquire(False)) - Bunch(f, 5).wait_for_finished() + with Bunch(f, 5): + pass # There can be a thread switch between acquiring the semaphore and # appending the result, therefore results will not necessarily be # ordered. @@ -807,12 +915,14 @@ def test_default_value(self): def f(): sem.acquire() sem.release() - b = Bunch(f, 1) - b.wait_for_started() - _wait() - self.assertFalse(b.finished) - sem.release() - b.wait_for_finished() + + with Bunch(f, 1) as bunch: + # Thread blocked on sem.acquire() + wait_threads_blocked(1) + self.assertFalse(bunch.finished) + + # Thread unblocked + sem.release() def test_with(self): sem = self.semtype(2) @@ -883,13 +993,13 @@ class BarrierTests(BaseTestCase): def setUp(self): self.barrier = self.barriertype(self.N, timeout=self.defaultTimeout) + def tearDown(self): self.barrier.abort() def run_threads(self, f): - b = Bunch(f, self.N-1) - f() - b.wait_for_finished() + with Bunch(f, self.N): + pass def multipass(self, results, n): m = self.barrier.parties @@ -980,8 +1090,9 @@ def f(): i = self.barrier.wait() if i == self.N//2: # Wait until the other threads are all in the barrier. - while self.barrier.n_waiting < self.N-1: - time.sleep(0.001) + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if self.barrier.n_waiting >= (self.N - 1): + break self.barrier.reset() else: try: @@ -1041,27 +1152,27 @@ def f(): i = self.barrier.wait() if i == self.N // 2: # One thread is late! - time.sleep(1.0) + time.sleep(self.defaultTimeout / 2) # Default timeout is 2.0, so this is shorter. self.assertRaises(threading.BrokenBarrierError, - self.barrier.wait, 0.5) + self.barrier.wait, self.defaultTimeout / 4) self.run_threads(f) def test_default_timeout(self): """ Test the barrier's default timeout """ - # gh-109401: Barrier timeout should be long enough - # to create 4 threads on a slow CI. - timeout = 1.0 - barrier = self.barriertype(self.N, timeout=timeout) + timeout = 0.100 + barrier = self.barriertype(2, timeout=timeout) def f(): - i = barrier.wait() - if i == self.N // 2: - # One thread is later than the default timeout. 
- time.sleep(timeout * 2) - self.assertRaises(threading.BrokenBarrierError, barrier.wait) - self.run_threads(f) + self.assertRaises(threading.BrokenBarrierError, + barrier.wait) + + start_time = time.monotonic() + with Bunch(f, 1): + pass + dt = time.monotonic() - start_time + self.assertGreaterEqual(dt, timeout) def test_single_thread(self): b = self.barriertype(1) @@ -1069,16 +1180,28 @@ def test_single_thread(self): b.wait() def test_repr(self): - b = self.barriertype(3) - self.assertRegex(repr(b), r"<\w+\.Barrier at .*: waiters=0/3>") + barrier = self.barriertype(3) + timeout = support.LONG_TIMEOUT + self.assertRegex(repr(barrier), r"<\w+\.Barrier at .*: waiters=0/3>") def f(): - b.wait(3) - bunch = Bunch(f, 2) - bunch.wait_for_started() - time.sleep(0.2) - self.assertRegex(repr(b), r"<\w+\.Barrier at .*: waiters=2/3>") - b.wait(3) - bunch.wait_for_finished() - self.assertRegex(repr(b), r"<\w+\.Barrier at .*: waiters=0/3>") - b.abort() - self.assertRegex(repr(b), r"<\w+\.Barrier at .*: broken>") + barrier.wait(timeout) + + N = 2 + with Bunch(f, N): + # Threads blocked on barrier.wait() + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if barrier.n_waiting >= N: + break + self.assertRegex(repr(barrier), + r"<\w+\.Barrier at .*: waiters=2/3>") + + # Threads unblocked + barrier.wait(timeout) + + self.assertRegex(repr(barrier), + r"<\w+\.Barrier at .*: waiters=0/3>") + + # Abort the barrier + barrier.abort() + self.assertRegex(repr(barrier), + r"<\w+\.Barrier at .*: broken>") diff --git a/Lib/test/test_importlib/test_locks.py b/Lib/test/test_importlib/test_locks.py index 5549d5448bac1f..986f19690cd99f 100644 --- a/Lib/test/test_importlib/test_locks.py +++ b/Lib/test/test_importlib/test_locks.py @@ -88,7 +88,8 @@ def f(): b.release() if ra: a.release() - lock_tests.Bunch(f, NTHREADS).wait_for_finished() + with lock_tests.Bunch(f, NTHREADS): + pass self.assertEqual(len(results), NTHREADS) return results diff --git a/Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst b/Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst new file mode 100644 index 00000000000000..a130cf690a57cb --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst @@ -0,0 +1,3 @@ +Fix race conditions in test_threading lock tests. Wait until a condition is met +rather than using :func:`time.sleep` with a hardcoded number of seconds. Patch +by Victor Stinner. 
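The lock_tests change above boils down to one recurring pattern: instead of sleeping a hardcoded number of seconds and hoping a worker thread has reached the expected state, the tests now poll for that state inside a bounded retry loop that raises on timeout. A minimal standalone sketch of the pattern, assuming a CPython checkout where test.support (with SHORT_TIMEOUT and sleeping_retry()) is importable; the worker() helper and the results list here are illustrative only and are not part of the patch:

    import threading
    from test import support

    results = []
    lock = threading.Lock()

    def worker():
        # Blocks until the main thread releases the lock.
        lock.acquire()
        results.append(threading.get_ident())

    lock.acquire()
    thread = threading.Thread(target=worker)
    thread.start()

    lock.release()
    # Poll until the worker has recorded its result; sleeping_retry()
    # sleeps between iterations and raises once SHORT_TIMEOUT elapses,
    # so a slow machine simply waits longer instead of failing spuriously.
    for _ in support.sleeping_retry(support.SHORT_TIMEOUT):
        if len(results) == 1:
            break
    thread.join()

This is the same shape as the wait_count() and wait_phase() helpers introduced in Lib/test/lock_tests.py by the patch, reduced to a self-contained example.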
From b0e43cb6cb99b5ebe62d00727d54c4f05f5a739e Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Wed, 4 Oct 2023 08:30:48 -0600 Subject: [PATCH 016/265] [3.11] Lint: Remove files that no longer fail to parse (GH-110356) (#110361) Remove files that no longer fail to parse --- Lib/test/.ruff.toml | 2 -- 1 file changed, 2 deletions(-) diff --git a/Lib/test/.ruff.toml b/Lib/test/.ruff.toml index bd46f98512d6e2..7ef54160ec6a81 100644 --- a/Lib/test/.ruff.toml +++ b/Lib/test/.ruff.toml @@ -9,8 +9,6 @@ extend-exclude = [ "encoded_modules/module_iso_8859_1.py", "encoded_modules/module_koi8_r.py", "test_source_encoding.py", - # Failed to parse - "test_fstring.py", # TODO Fix: F811 Redefinition of unused name "test_buffer.py", "test_dataclasses/__init__.py", From 34c7793cd04526827495293a098db527889d7873 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 05:21:34 -0700 Subject: [PATCH 017/265] [3.11] gh-110365: Fix error overwrite in `termios.tcsetattr` (GH-110366) (#110390) (cherry picked from commit 2bbbab212fb10b3aeaded188fb5d6c001fb4bf74) Co-authored-by: Nikita Sobolev Co-authored-by: Erlend E. Aasland --- ...-10-04-18-56-29.gh-issue-110365.LCxiau.rst | 2 + Modules/termios.c | 39 ++++++++++++------- 2 files changed, 28 insertions(+), 13 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst diff --git a/Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst b/Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst new file mode 100644 index 00000000000000..a1ac39b60296a3 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst @@ -0,0 +1,2 @@ +Fix :func:`termios.tcsetattr` bug that was overwritting existing errors +during parsing integers from ``term`` list. 
diff --git a/Modules/termios.c b/Modules/termios.c index 3900a6f0b89860..be5d099072c8ee 100644 --- a/Modules/termios.c +++ b/Modules/termios.c @@ -183,17 +183,25 @@ termios_tcsetattr_impl(PyObject *module, int fd, int when, PyObject *term) return PyErr_SetFromErrno(state->TermiosError); } - mode.c_iflag = (tcflag_t) PyLong_AsLong(PyList_GetItem(term, 0)); - mode.c_oflag = (tcflag_t) PyLong_AsLong(PyList_GetItem(term, 1)); - mode.c_cflag = (tcflag_t) PyLong_AsLong(PyList_GetItem(term, 2)); - mode.c_lflag = (tcflag_t) PyLong_AsLong(PyList_GetItem(term, 3)); - speed_t ispeed = (speed_t) PyLong_AsLong(PyList_GetItem(term, 4)); - speed_t ospeed = (speed_t) PyLong_AsLong(PyList_GetItem(term, 5)); - PyObject *cc = PyList_GetItem(term, 6); - if (PyErr_Occurred()) { - return NULL; - } - + speed_t ispeed, ospeed; +#define SET_FROM_LIST(TYPE, VAR, LIST, N) do { \ + PyObject *item = PyList_GET_ITEM(LIST, N); \ + long num = PyLong_AsLong(item); \ + if (num == -1 && PyErr_Occurred()) { \ + return NULL; \ + } \ + VAR = (TYPE)num; \ +} while (0) + + SET_FROM_LIST(tcflag_t, mode.c_iflag, term, 0); + SET_FROM_LIST(tcflag_t, mode.c_oflag, term, 1); + SET_FROM_LIST(tcflag_t, mode.c_cflag, term, 2); + SET_FROM_LIST(tcflag_t, mode.c_lflag, term, 3); + SET_FROM_LIST(speed_t, ispeed, term, 4); + SET_FROM_LIST(speed_t, ospeed, term, 5); +#undef SET_FROM_LIST + + PyObject *cc = PyList_GET_ITEM(term, 6); if (!PyList_Check(cc) || PyList_Size(cc) != NCCS) { PyErr_Format(PyExc_TypeError, "tcsetattr: attributes[6] must be %d element list", @@ -208,8 +216,13 @@ termios_tcsetattr_impl(PyObject *module, int fd, int when, PyObject *term) if (PyBytes_Check(v) && PyBytes_Size(v) == 1) mode.c_cc[i] = (cc_t) * PyBytes_AsString(v); - else if (PyLong_Check(v)) - mode.c_cc[i] = (cc_t) PyLong_AsLong(v); + else if (PyLong_Check(v)) { + long num = PyLong_AsLong(v); + if (num == -1 && PyErr_Occurred()) { + return NULL; + } + mode.c_cc[i] = (cc_t)num; + } else { PyErr_SetString(PyExc_TypeError, "tcsetattr: elements of attributes must be characters or integers"); From cacea7a27260f56f6a29f0d7d37878c569d59cfb Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 09:58:46 -0700 Subject: [PATCH 018/265] [3.11] gh-110391: socket NetworkConnectionAttributesTest always declare cli (GH-110401) (#110406) gh-110391: socket NetworkConnectionAttributesTest always declare cli (GH-110401) NetworkConnectionAttributesTest of test_socket now always declare the 'cli' attribute, so clientTearDown() cannot fail with AttributeError. 
(cherry picked from commit e37d4557c3de0476e76ca4b8a1cc8d2566b86c79) Co-authored-by: Victor Stinner --- Lib/test/test_socket.py | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index af0d2a4e6f630a..190efb416ba39a 100644 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -5239,6 +5239,7 @@ def test_create_connection_timeout(self): class NetworkConnectionAttributesTest(SocketTCPTest, ThreadableTest): + cli = None def __init__(self, methodName='runTest'): SocketTCPTest.__init__(self, methodName=methodName) @@ -5248,7 +5249,8 @@ def clientSetUp(self): self.source_port = socket_helper.find_unused_port() def clientTearDown(self): - self.cli.close() + if self.cli is not None: + self.cli.close() self.cli = None ThreadableTest.clientTearDown(self) From 6a6081f820cd850f1a4cac4a1ba28952bdf4a706 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 10:10:39 -0700 Subject: [PATCH 019/265] [3.11] gh-110383 TimeIt Docs Spelling Fix (GH-110407) (#110410) gh-110383 TimeIt Docs Spelling Fix (GH-110407) Make 0.2 second plural (cherry picked from commit a973bf0f97e55ace9eab100f9eb95d7eedcb28ac) Co-authored-by: Towster15 <105541074+Towster15@users.noreply.github.com> --- Doc/library/timeit.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index b59a6bc3742ae4..d0abea38817234 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -151,7 +151,7 @@ The module defines three convenience functions and a public class: so that the total time >= 0.2 second, returning the eventual (number of loops, time taken for that number of loops). It calls :meth:`.timeit` with increasing numbers from the sequence 1, 2, 5, - 10, 20, 50, ... until the time taken is at least 0.2 second. + 10, 20, 50, ... until the time taken is at least 0.2 seconds. If *callback* is given and is not ``None``, it will be called after each trial with two arguments: ``callback(number, time_taken)``. From 8394368f1fce672ad25af512063a0f8f5d8a08f9 Mon Sep 17 00:00:00 2001 From: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Date: Thu, 5 Oct 2023 18:30:26 +0100 Subject: [PATCH 020/265] [3.11] Docs: Avoid the deprecated ``.. cmdoption::`` directive (GH-110292) (#110303) [3.11] Docs: Avoid the deprecated ``.. cmdoption::`` directive (GH-110292). (cherry picked from commit 77e9aae3837d9f0cf87461d023896f2c4aeb282f) --- Doc/library/ast.rst | 14 ++--- Doc/library/compileall.rst | 34 +++++----- Doc/library/gzip.rst | 10 +-- Doc/library/inspect.rst | 2 +- Doc/library/json.rst | 14 ++--- Doc/library/pickletools.rst | 10 +-- Doc/library/py_compile.rst | 6 +- Doc/library/site.rst | 4 +- Doc/library/tarfile.rst | 20 +++--- Doc/library/timeit.rst | 14 ++--- Doc/library/tokenize.rst | 4 +- Doc/library/trace.rst | 30 ++++----- Doc/library/unittest.rst | 18 +++--- Doc/library/zipapp.rst | 12 ++-- Doc/library/zipfile.rst | 18 +++--- Doc/using/cmdline.rst | 60 +++++++++--------- Doc/using/configure.rst | 120 ++++++++++++++++++------------------ 17 files changed, 195 insertions(+), 195 deletions(-) diff --git a/Doc/library/ast.rst b/Doc/library/ast.rst index 6392f7318d7a6d..e95e18c1778069 100644 --- a/Doc/library/ast.rst +++ b/Doc/library/ast.rst @@ -2337,26 +2337,26 @@ The following options are accepted: .. program:: ast -.. cmdoption:: -h, --help +.. option:: -h, --help Show the help message and exit. -.. cmdoption:: -m - --mode +.. 
option:: -m + --mode Specify what kind of code must be compiled, like the *mode* argument in :func:`parse`. -.. cmdoption:: --no-type-comments +.. option:: --no-type-comments Don't parse type comments. -.. cmdoption:: -a, --include-attributes +.. option:: -a, --include-attributes Include attributes such as line numbers and column offsets. -.. cmdoption:: -i - --indent +.. option:: -i + --indent Indentation of nodes in AST (number of spaces). diff --git a/Doc/library/compileall.rst b/Doc/library/compileall.rst index 75b97d6ff4588e..a1482c9eb889e8 100644 --- a/Doc/library/compileall.rst +++ b/Doc/library/compileall.rst @@ -24,28 +24,28 @@ compile Python sources. .. program:: compileall -.. cmdoption:: directory ... - file ... +.. option:: directory ... + file ... Positional arguments are files to compile or directories that contain source files, traversed recursively. If no argument is given, behave as if the command line was :samp:`-l {}`. -.. cmdoption:: -l +.. option:: -l Do not recurse into subdirectories, only compile source code files directly contained in the named or implied directories. -.. cmdoption:: -f +.. option:: -f Force rebuild even if timestamps are up-to-date. -.. cmdoption:: -q +.. option:: -q Do not print the list of files compiled. If passed once, error messages will still be printed. If passed twice (``-qq``), all output is suppressed. -.. cmdoption:: -d destdir +.. option:: -d destdir Directory prepended to the path to each file being compiled. This will appear in compilation time tracebacks, and is also compiled in to the @@ -53,45 +53,45 @@ compile Python sources. cases where the source file does not exist at the time the byte-code file is executed. -.. cmdoption:: -s strip_prefix -.. cmdoption:: -p prepend_prefix +.. option:: -s strip_prefix +.. option:: -p prepend_prefix Remove (``-s``) or append (``-p``) the given prefix of paths recorded in the ``.pyc`` files. Cannot be combined with ``-d``. -.. cmdoption:: -x regex +.. option:: -x regex regex is used to search the full path to each file considered for compilation, and if the regex produces a match, the file is skipped. -.. cmdoption:: -i list +.. option:: -i list Read the file ``list`` and add each line that it contains to the list of files and directories to compile. If ``list`` is ``-``, read lines from ``stdin``. -.. cmdoption:: -b +.. option:: -b Write the byte-code files to their legacy locations and names, which may overwrite byte-code files created by another version of Python. The default is to write files to their :pep:`3147` locations and names, which allows byte-code files from multiple versions of Python to coexist. -.. cmdoption:: -r +.. option:: -r Control the maximum recursion level for subdirectories. If this is given, then ``-l`` option will not be taken into account. :program:`python -m compileall -r 0` is equivalent to :program:`python -m compileall -l`. -.. cmdoption:: -j N +.. option:: -j N Use *N* workers to compile the files within the given directory. If ``0`` is used, then the result of :func:`os.cpu_count()` will be used. -.. cmdoption:: --invalidation-mode [timestamp|checked-hash|unchecked-hash] +.. option:: --invalidation-mode [timestamp|checked-hash|unchecked-hash] Control how the generated byte-code files are invalidated at runtime. The ``timestamp`` value, means that ``.pyc`` files with the source timestamp @@ -104,17 +104,17 @@ compile Python sources. variable is not set, and ``checked-hash`` if the ``SOURCE_DATE_EPOCH`` environment variable is set. -.. cmdoption:: -o level +.. 
option:: -o level Compile with the given optimization level. May be used multiple times to compile for multiple levels at a time (for example, ``compileall -o 1 -o 2``). -.. cmdoption:: -e dir +.. option:: -e dir Ignore symlinks pointing outside the given directory. -.. cmdoption:: --hardlink-dupes +.. option:: --hardlink-dupes If two ``.pyc`` files with different optimization level have the same content, use hard links to consolidate duplicate files. diff --git a/Doc/library/gzip.rst b/Doc/library/gzip.rst index c28cfdc50e3989..4d2fa3d4012a1d 100644 --- a/Doc/library/gzip.rst +++ b/Doc/library/gzip.rst @@ -262,23 +262,23 @@ Once executed the :mod:`gzip` module keeps the input file(s). Command line options ^^^^^^^^^^^^^^^^^^^^ -.. cmdoption:: file +.. option:: file If *file* is not specified, read from :data:`sys.stdin`. -.. cmdoption:: --fast +.. option:: --fast Indicates the fastest compression method (less compression). -.. cmdoption:: --best +.. option:: --best Indicates the slowest compression method (best compression). -.. cmdoption:: -d, --decompress +.. option:: -d, --decompress Decompress the given file. -.. cmdoption:: -h, --help +.. option:: -h, --help Show the help message. diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index 11f2d701199771..8b2bfb322529b1 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -1570,6 +1570,6 @@ By default, accepts the name of a module and prints the source of that module. A class or function within the module can be printed instead by appended a colon and the qualified name of the target object. -.. cmdoption:: --details +.. option:: --details Print information about the specified object rather than the source code diff --git a/Doc/library/json.rst b/Doc/library/json.rst index 6c3059381776c9..e234fe92bc9995 100644 --- a/Doc/library/json.rst +++ b/Doc/library/json.rst @@ -703,7 +703,7 @@ specified, :data:`sys.stdin` and :data:`sys.stdout` will be used respectively: Command line options ^^^^^^^^^^^^^^^^^^^^ -.. cmdoption:: infile +.. option:: infile The JSON file to be validated or pretty-printed: @@ -723,36 +723,36 @@ Command line options If *infile* is not specified, read from :data:`sys.stdin`. -.. cmdoption:: outfile +.. option:: outfile Write the output of the *infile* to the given *outfile*. Otherwise, write it to :data:`sys.stdout`. -.. cmdoption:: --sort-keys +.. option:: --sort-keys Sort the output of dictionaries alphabetically by key. .. versionadded:: 3.5 -.. cmdoption:: --no-ensure-ascii +.. option:: --no-ensure-ascii Disable escaping of non-ascii characters, see :func:`json.dumps` for more information. .. versionadded:: 3.9 -.. cmdoption:: --json-lines +.. option:: --json-lines Parse every input line as separate JSON object. .. versionadded:: 3.8 -.. cmdoption:: --indent, --tab, --no-indent, --compact +.. option:: --indent, --tab, --no-indent, --compact Mutually exclusive options for whitespace control. .. versionadded:: 3.9 -.. cmdoption:: -h, --help +.. option:: -h, --help Show the help message. diff --git a/Doc/library/pickletools.rst b/Doc/library/pickletools.rst index 480f4a6d320815..c6ff4e6ed3d3b5 100644 --- a/Doc/library/pickletools.rst +++ b/Doc/library/pickletools.rst @@ -51,24 +51,24 @@ Command line options .. program:: pickletools -.. cmdoption:: -a, --annotate +.. option:: -a, --annotate Annotate each line with a short opcode description. -.. cmdoption:: -o, --output= +.. option:: -o, --output= Name of a file where the output should be written. -.. cmdoption:: -l, --indentlevel= +.. 
option:: -l, --indentlevel= The number of blanks by which to indent a new MARK level. -.. cmdoption:: -m, --memo +.. option:: -m, --memo When multiple objects are disassembled, preserve memo between disassemblies. -.. cmdoption:: -p, --preamble= +.. option:: -p, --preamble= When more than one pickle file are specified, print given preamble before each disassembly. diff --git a/Doc/library/py_compile.rst b/Doc/library/py_compile.rst index 69b93a3bdfcb26..7272f36d9a5568 100644 --- a/Doc/library/py_compile.rst +++ b/Doc/library/py_compile.rst @@ -138,13 +138,13 @@ not be compiled. .. program:: python -m py_compile -.. cmdoption:: ... - - +.. option:: ... + - Positional arguments are files to compile. If ``-`` is the only parameter, the list of files is taken from standard input. -.. cmdoption:: -q, --quiet +.. option:: -q, --quiet Suppress errors output. diff --git a/Doc/library/site.rst b/Doc/library/site.rst index c025af608e318b..ab4fa765bf63d5 100644 --- a/Doc/library/site.rst +++ b/Doc/library/site.rst @@ -266,11 +266,11 @@ If it is called without arguments, it will print the contents of :data:`USER_BASE` and whether the directory exists, then the same thing for :data:`USER_SITE`, and finally the value of :data:`ENABLE_USER_SITE`. -.. cmdoption:: --user-base +.. option:: --user-base Print the path to the user base directory. -.. cmdoption:: --user-site +.. option:: --user-site Print the path to the user site-packages directory. diff --git a/Doc/library/tarfile.rst b/Doc/library/tarfile.rst index 61a450cf88b0ac..76c8602b7537b7 100644 --- a/Doc/library/tarfile.rst +++ b/Doc/library/tarfile.rst @@ -1149,31 +1149,31 @@ For a list of the files in a tar archive, use the :option:`-l` option: Command-line options ~~~~~~~~~~~~~~~~~~~~ -.. cmdoption:: -l - --list +.. option:: -l + --list List files in a tarfile. -.. cmdoption:: -c ... - --create ... +.. option:: -c ... + --create ... Create tarfile from source files. -.. cmdoption:: -e [] - --extract [] +.. option:: -e [] + --extract [] Extract tarfile into the current directory if *output_dir* is not specified. -.. cmdoption:: -t - --test +.. option:: -t + --test Test whether the tarfile is valid or not. -.. cmdoption:: -v, --verbose +.. option:: -v, --verbose Verbose output. -.. cmdoption:: --filter +.. option:: --filter Specifies the *filter* for ``--extract``. See :ref:`tarfile-extraction-filter` for details. diff --git a/Doc/library/timeit.rst b/Doc/library/timeit.rst index d0abea38817234..c9afb7d802e84f 100644 --- a/Doc/library/timeit.rst +++ b/Doc/library/timeit.rst @@ -214,36 +214,36 @@ Where the following options are understood: .. program:: timeit -.. cmdoption:: -n N, --number=N +.. option:: -n N, --number=N how many times to execute 'statement' -.. cmdoption:: -r N, --repeat=N +.. option:: -r N, --repeat=N how many times to repeat the timer (default 5) -.. cmdoption:: -s S, --setup=S +.. option:: -s S, --setup=S statement to be executed once initially (default ``pass``) -.. cmdoption:: -p, --process +.. option:: -p, --process measure process time, not wallclock time, using :func:`time.process_time` instead of :func:`time.perf_counter`, which is the default .. versionadded:: 3.3 -.. cmdoption:: -u, --unit=U +.. option:: -u, --unit=U specify a time unit for timer output; can select ``nsec``, ``usec``, ``msec``, or ``sec`` .. versionadded:: 3.5 -.. cmdoption:: -v, --verbose +.. option:: -v, --verbose print raw timing results; repeat for more digits precision -.. cmdoption:: -h, --help +.. 
option:: -h, --help print a short usage message and exit diff --git a/Doc/library/tokenize.rst b/Doc/library/tokenize.rst index a903bff1da8f5e..6f6c4d5371bb7f 100644 --- a/Doc/library/tokenize.rst +++ b/Doc/library/tokenize.rst @@ -171,11 +171,11 @@ The following options are accepted: .. program:: tokenize -.. cmdoption:: -h, --help +.. option:: -h, --help show this help message and exit -.. cmdoption:: -e, --exact +.. option:: -e, --exact display token names using the exact type diff --git a/Doc/library/trace.rst b/Doc/library/trace.rst index 40cf198f1287d7..e9b59a6d186ba2 100644 --- a/Doc/library/trace.rst +++ b/Doc/library/trace.rst @@ -34,11 +34,11 @@ all Python modules imported during the execution into the current directory. .. program:: trace -.. cmdoption:: --help +.. option:: --help Display usage and exit. -.. cmdoption:: --version +.. option:: --version Display the version of the module and exit. @@ -56,28 +56,28 @@ the :option:`--trace <-t>` and :option:`--count <-c>` options. When .. program:: trace -.. cmdoption:: -c, --count +.. option:: -c, --count Produce a set of annotated listing files upon program completion that shows how many times each statement was executed. See also :option:`--coverdir <-C>`, :option:`--file <-f>` and :option:`--no-report <-R>` below. -.. cmdoption:: -t, --trace +.. option:: -t, --trace Display lines as they are executed. -.. cmdoption:: -l, --listfuncs +.. option:: -l, --listfuncs Display the functions executed by running the program. -.. cmdoption:: -r, --report +.. option:: -r, --report Produce an annotated list from an earlier program run that used the :option:`--count <-c>` and :option:`--file <-f>` option. This does not execute any code. -.. cmdoption:: -T, --trackcalls +.. option:: -T, --trackcalls Display the calling relationships exposed by running the program. @@ -86,33 +86,33 @@ Modifiers .. program:: trace -.. cmdoption:: -f, --file= +.. option:: -f, --file= Name of a file to accumulate counts over several tracing runs. Should be used with the :option:`--count <-c>` option. -.. cmdoption:: -C, --coverdir= +.. option:: -C, --coverdir= Directory where the report files go. The coverage report for ``package.module`` is written to file :file:`{dir}/{package}/{module}.cover`. -.. cmdoption:: -m, --missing +.. option:: -m, --missing When generating annotated listings, mark lines which were not executed with ``>>>>>>``. -.. cmdoption:: -s, --summary +.. option:: -s, --summary When using :option:`--count <-c>` or :option:`--report <-r>`, write a brief summary to stdout for each file processed. -.. cmdoption:: -R, --no-report +.. option:: -R, --no-report Do not generate annotated listings. This is useful if you intend to make several runs with :option:`--count <-c>`, and then produce a single set of annotated listings at the end. -.. cmdoption:: -g, --timing +.. option:: -g, --timing Prefix each line with the time since the program started. Only used while tracing. @@ -124,12 +124,12 @@ These options may be repeated multiple times. .. program:: trace -.. cmdoption:: --ignore-module= +.. option:: --ignore-module= Ignore each of the given module names and its submodules (if it is a package). The argument can be a list of names separated by a comma. -.. cmdoption:: --ignore-dir= +.. option:: --ignore-dir= Ignore all modules and packages in the named directory and subdirectories. The argument can be a list of directories separated by :data:`os.pathsep`. 
diff --git a/Doc/library/unittest.rst b/Doc/library/unittest.rst index 2c6316d66eadfc..c5826698bd35f5 100644 --- a/Doc/library/unittest.rst +++ b/Doc/library/unittest.rst @@ -206,13 +206,13 @@ Command-line options .. program:: unittest -.. cmdoption:: -b, --buffer +.. option:: -b, --buffer The standard output and standard error streams are buffered during the test run. Output during a passing test is discarded. Output is echoed normally on test fail or error and is added to the failure messages. -.. cmdoption:: -c, --catch +.. option:: -c, --catch :kbd:`Control-C` during the test run waits for the current test to end and then reports all the results so far. A second :kbd:`Control-C` raises the normal @@ -220,11 +220,11 @@ Command-line options See `Signal Handling`_ for the functions that provide this functionality. -.. cmdoption:: -f, --failfast +.. option:: -f, --failfast Stop the test run on the first error or failure. -.. cmdoption:: -k +.. option:: -k Only run test methods and classes that match the pattern or substring. This option may be used multiple times, in which case all test cases that @@ -240,7 +240,7 @@ Command-line options For example, ``-k foo`` matches ``foo_tests.SomeTest.test_something``, ``bar_tests.SomeTest.test_foo``, but not ``bar_tests.FooTest.test_something``. -.. cmdoption:: --locals +.. option:: --locals Show local variables in tracebacks. @@ -286,19 +286,19 @@ The ``discover`` sub-command has the following options: .. program:: unittest discover -.. cmdoption:: -v, --verbose +.. option:: -v, --verbose Verbose output -.. cmdoption:: -s, --start-directory directory +.. option:: -s, --start-directory directory Directory to start discovery (``.`` default) -.. cmdoption:: -p, --pattern pattern +.. option:: -p, --pattern pattern Pattern to match test files (``test*.py`` default) -.. cmdoption:: -t, --top-level-directory directory +.. option:: -t, --top-level-directory directory Top level directory of project (defaults to start directory) diff --git a/Doc/library/zipapp.rst b/Doc/library/zipapp.rst index 81d61d8fe853d3..c98d8a6036bf83 100644 --- a/Doc/library/zipapp.rst +++ b/Doc/library/zipapp.rst @@ -54,7 +54,7 @@ The following options are understood: .. program:: zipapp -.. cmdoption:: -o , --output= +.. option:: -o , --output= Write the output to a file named *output*. If this option is not specified, the output filename will be the same as the input *source*, with the @@ -64,13 +64,13 @@ The following options are understood: An output filename must be specified if the *source* is an archive (and in that case, *output* must not be the same as *source*). -.. cmdoption:: -p , --python= +.. option:: -p , --python= Add a ``#!`` line to the archive specifying *interpreter* as the command to run. Also, on POSIX, make the archive executable. The default is to write no ``#!`` line, and not make the file executable. -.. cmdoption:: -m , --main= +.. option:: -m , --main= Write a ``__main__.py`` file to the archive that executes *mainfn*. The *mainfn* argument should have the form "pkg.mod:fn", where "pkg.mod" is a @@ -79,7 +79,7 @@ The following options are understood: :option:`--main` cannot be specified when copying an archive. -.. cmdoption:: -c, --compress +.. option:: -c, --compress Compress files with the deflate method, reducing the size of the output file. By default, files are stored uncompressed in the archive. @@ -88,13 +88,13 @@ The following options are understood: .. versionadded:: 3.7 -.. cmdoption:: --info +.. 
option:: --info Display the interpreter embedded in the archive, for diagnostic purposes. In this case, any other options are ignored and SOURCE must be an archive, not a directory. -.. cmdoption:: -h, --help +.. option:: -h, --help Print a short usage message and exit. diff --git a/Doc/library/zipfile.rst b/Doc/library/zipfile.rst index f4867575447c0e..ef84115a4b3c13 100644 --- a/Doc/library/zipfile.rst +++ b/Doc/library/zipfile.rst @@ -904,27 +904,27 @@ For a list of the files in a ZIP archive, use the :option:`-l` option: Command-line options ~~~~~~~~~~~~~~~~~~~~ -.. cmdoption:: -l - --list +.. option:: -l + --list List files in a zipfile. -.. cmdoption:: -c ... - --create ... +.. option:: -c ... + --create ... Create zipfile from source files. -.. cmdoption:: -e - --extract +.. option:: -e + --extract Extract zipfile into target directory. -.. cmdoption:: -t - --test +.. option:: -t + --test Test whether the zipfile is valid or not. -.. cmdoption:: --metadata-encoding +.. option:: --metadata-encoding Specify encoding of member names for :option:`-l`, :option:`-e` and :option:`-t`. diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst index c42ef78c6b09d3..79aefdaf389de5 100644 --- a/Doc/using/cmdline.rst +++ b/Doc/using/cmdline.rst @@ -59,7 +59,7 @@ all consecutive arguments will end up in :data:`sys.argv` -- note that the first element, subscript zero (``sys.argv[0]``), is a string reflecting the program's source. -.. cmdoption:: -c +.. option:: -c Execute the Python code in *command*. *command* can be one or more statements separated by newlines, with significant leading whitespace as in @@ -72,7 +72,7 @@ source. .. audit-event:: cpython.run_command command cmdoption-c -.. cmdoption:: -m +.. option:: -m Search :data:`sys.path` for the named module and execute its contents as the :mod:`__main__` module. @@ -188,35 +188,35 @@ automatically enabled, if available on your platform (see Generic options ~~~~~~~~~~~~~~~ -.. cmdoption:: -? - -h - --help +.. option:: -? + -h + --help Print a short description of all command line options and corresponding environment variables and exit. -.. cmdoption:: --help-env +.. option:: --help-env Print a short description of Python-specific environment variables and exit. .. versionadded:: 3.11 -.. cmdoption:: --help-xoptions +.. option:: --help-xoptions Print a description of implementation-specific :option:`-X` options and exit. .. versionadded:: 3.11 -.. cmdoption:: --help-all +.. option:: --help-all Print complete usage information and exit. .. versionadded:: 3.11 -.. cmdoption:: -V - --version +.. option:: -V + --version Print the Python version number and exit. Example output could be: @@ -240,7 +240,7 @@ Generic options Miscellaneous options ~~~~~~~~~~~~~~~~~~~~~ -.. cmdoption:: -b +.. option:: -b Issue a warning when comparing :class:`bytes` or :class:`bytearray` with :class:`str` or :class:`bytes` with :class:`int`. Issue an error when the @@ -249,13 +249,13 @@ Miscellaneous options .. versionchanged:: 3.5 Affects comparisons of :class:`bytes` with :class:`int`. -.. cmdoption:: -B +.. option:: -B If given, Python won't try to write ``.pyc`` files on the import of source modules. See also :envvar:`PYTHONDONTWRITEBYTECODE`. -.. cmdoption:: --check-hash-based-pycs default|always|never +.. option:: --check-hash-based-pycs default|always|never Control the validation behavior of hash-based ``.pyc`` files. See :ref:`pyc-invalidation`. When set to ``default``, checked and unchecked @@ -269,13 +269,13 @@ Miscellaneous options option. -.. 
cmdoption:: -d +.. option:: -d Turn on parser debugging output (for expert only, depending on compilation options). See also :envvar:`PYTHONDEBUG`. -.. cmdoption:: -E +.. option:: -E Ignore all :envvar:`PYTHON*` environment variables, e.g. :envvar:`PYTHONPATH` and :envvar:`PYTHONHOME`, that might be set. @@ -283,7 +283,7 @@ Miscellaneous options See also the :option:`-P` and :option:`-I` (isolated) options. -.. cmdoption:: -i +.. option:: -i When a script is passed as first argument or the :option:`-c` option is used, enter interactive mode after executing the script or the command, even when @@ -294,7 +294,7 @@ Miscellaneous options raises an exception. See also :envvar:`PYTHONINSPECT`. -.. cmdoption:: -I +.. option:: -I Run Python in isolated mode. This also implies :option:`-E`, :option:`-P` and :option:`-s` options. @@ -307,7 +307,7 @@ Miscellaneous options .. versionadded:: 3.4 -.. cmdoption:: -O +.. option:: -O Remove assert statements and any code conditional on the value of :const:`__debug__`. Augment the filename for compiled @@ -318,7 +318,7 @@ Miscellaneous options Modify ``.pyc`` filenames according to :pep:`488`. -.. cmdoption:: -OO +.. option:: -OO Do :option:`-O` and also discard docstrings. Augment the filename for compiled (:term:`bytecode`) files by adding ``.opt-2`` before the @@ -328,7 +328,7 @@ Miscellaneous options Modify ``.pyc`` filenames according to :pep:`488`. -.. cmdoption:: -P +.. option:: -P Don't prepend a potentially unsafe path to :data:`sys.path`: @@ -345,14 +345,14 @@ Miscellaneous options .. versionadded:: 3.11 -.. cmdoption:: -q +.. option:: -q Don't display the copyright and version messages even in interactive mode. .. versionadded:: 3.2 -.. cmdoption:: -R +.. option:: -R Turn on hash randomization. This option only has an effect if the :envvar:`PYTHONHASHSEED` environment variable is set to ``0``, since hash @@ -378,7 +378,7 @@ Miscellaneous options .. versionadded:: 3.2.3 -.. cmdoption:: -s +.. option:: -s Don't add the :data:`user site-packages directory ` to :data:`sys.path`. @@ -388,7 +388,7 @@ Miscellaneous options :pep:`370` -- Per user site-packages directory -.. cmdoption:: -S +.. option:: -S Disable the import of the module :mod:`site` and the site-dependent manipulations of :data:`sys.path` that it entails. Also disable these @@ -396,7 +396,7 @@ Miscellaneous options :func:`site.main` if you want them to be triggered). -.. cmdoption:: -u +.. option:: -u Force the stdout and stderr streams to be unbuffered. This option has no effect on the stdin stream. @@ -407,7 +407,7 @@ Miscellaneous options The text layer of the stdout and stderr streams now is unbuffered. -.. cmdoption:: -v +.. option:: -v Print a message each time a module is initialized, showing the place (filename or built-in module) from which it is loaded. When given twice @@ -422,7 +422,7 @@ Miscellaneous options .. _using-on-warnings: -.. cmdoption:: -W arg +.. option:: -W arg Warning control. Python's warning machinery by default prints warning messages to :data:`sys.stderr`. @@ -481,13 +481,13 @@ Miscellaneous options details. -.. cmdoption:: -x +.. option:: -x Skip the first line of the source, allowing use of non-Unix forms of ``#!cmd``. This is intended for a DOS specific hack only. -.. cmdoption:: -X +.. option:: -X Reserved for various implementation-specific options. CPython currently defines the following possible values: @@ -586,7 +586,7 @@ Miscellaneous options Options you shouldn't use ~~~~~~~~~~~~~~~~~~~~~~~~~ -.. cmdoption:: -J +.. 
option:: -J Reserved for use by Jython_. diff --git a/Doc/using/configure.rst b/Doc/using/configure.rst index 04cd3769f22632..2d9a374ba26708 100644 --- a/Doc/using/configure.rst +++ b/Doc/using/configure.rst @@ -16,7 +16,7 @@ See also the :file:`Misc/SpecialBuilds.txt` in the Python source distribution. General Options --------------- -.. cmdoption:: --enable-loadable-sqlite-extensions +.. option:: --enable-loadable-sqlite-extensions Support loadable extensions in the :mod:`!_sqlite` extension module (default is no) of the :mod:`sqlite3` module. @@ -26,12 +26,12 @@ General Options .. versionadded:: 3.6 -.. cmdoption:: --disable-ipv6 +.. option:: --disable-ipv6 Disable IPv6 support (enabled by default if supported), see the :mod:`socket` module. -.. cmdoption:: --enable-big-digits=[15|30] +.. option:: --enable-big-digits=[15|30] Define the size in bits of Python :class:`int` digits: 15 or 30 bits. @@ -41,13 +41,13 @@ General Options See :data:`sys.int_info.bits_per_digit `. -.. cmdoption:: --with-cxx-main -.. cmdoption:: --with-cxx-main=COMPILER +.. option:: --with-cxx-main +.. option:: --with-cxx-main=COMPILER Compile the Python ``main()`` function and link Python executable with C++ compiler: ``$CXX``, or *COMPILER* if specified. -.. cmdoption:: --with-suffix=SUFFIX +.. option:: --with-suffix=SUFFIX Set the Python executable suffix to *SUFFIX*. @@ -60,7 +60,7 @@ General Options The default suffix on WASM platform is one of ``.js``, ``.html`` or ``.wasm``. -.. cmdoption:: --with-tzpath= +.. option:: --with-tzpath= Select the default time zone search path for :const:`zoneinfo.TZPATH`. See the :ref:`Compile-time configuration @@ -72,7 +72,7 @@ General Options .. versionadded:: 3.9 -.. cmdoption:: --without-decimal-contextvar +.. option:: --without-decimal-contextvar Build the ``_decimal`` extension module using a thread-local context rather than a coroutine-local context (default), see the :mod:`decimal` module. @@ -81,7 +81,7 @@ General Options .. versionadded:: 3.9 -.. cmdoption:: --with-dbmliborder= +.. option:: --with-dbmliborder= Override order to check db backends for the :mod:`dbm` module @@ -91,7 +91,7 @@ General Options * ``gdbm``; * ``bdb``. -.. cmdoption:: --without-c-locale-coercion +.. option:: --without-c-locale-coercion Disable C locale coercion to a UTF-8 based locale (enabled by default). @@ -99,7 +99,7 @@ General Options See :envvar:`PYTHONCOERCECLOCALE` and the :pep:`538`. -.. cmdoption:: --with-platlibdir=DIRNAME +.. option:: --with-platlibdir=DIRNAME Python library directory name (default is ``lib``). @@ -109,7 +109,7 @@ General Options .. versionadded:: 3.9 -.. cmdoption:: --with-wheel-pkg-dir=PATH +.. option:: --with-wheel-pkg-dir=PATH Directory of wheel packages used by the :mod:`ensurepip` module (none by default). @@ -121,7 +121,7 @@ General Options .. versionadded:: 3.10 -.. cmdoption:: --with-pkg-config=[check|yes|no] +.. option:: --with-pkg-config=[check|yes|no] Whether configure should use :program:`pkg-config` to detect build dependencies. @@ -132,7 +132,7 @@ General Options .. versionadded:: 3.11 -.. cmdoption:: --enable-pystats +.. option:: --enable-pystats Turn on internal statistics gathering. @@ -146,7 +146,7 @@ General Options WebAssembly Options ------------------- -.. cmdoption:: --with-emscripten-target=[browser|node] +.. option:: --with-emscripten-target=[browser|node] Set build flavor for ``wasm32-emscripten``. @@ -155,7 +155,7 @@ WebAssembly Options .. versionadded:: 3.11 -.. cmdoption:: --enable-wasm-dynamic-linking +.. 
option:: --enable-wasm-dynamic-linking Turn on dynamic linking support for WASM. @@ -164,7 +164,7 @@ WebAssembly Options .. versionadded:: 3.11 -.. cmdoption:: --enable-wasm-pthreads +.. option:: --enable-wasm-pthreads Turn on pthreads support for WASM. @@ -174,7 +174,7 @@ WebAssembly Options Install Options --------------- -.. cmdoption:: --prefix=PREFIX +.. option:: --prefix=PREFIX Install architecture-independent files in PREFIX. On Unix, it defaults to :file:`/usr/local`. @@ -184,20 +184,20 @@ Install Options As an example, one can use ``--prefix="$HOME/.local/"`` to install a Python in its home directory. -.. cmdoption:: --exec-prefix=EPREFIX +.. option:: --exec-prefix=EPREFIX Install architecture-dependent files in EPREFIX, defaults to :option:`--prefix`. This value can be retrieved at runtime using :data:`sys.exec_prefix`. -.. cmdoption:: --disable-test-modules +.. option:: --disable-test-modules Don't build nor install test modules, like the :mod:`test` package or the :mod:`!_testcapi` extension module (built and installed by default). .. versionadded:: 3.10 -.. cmdoption:: --with-ensurepip=[upgrade|install|no] +.. option:: --with-ensurepip=[upgrade|install|no] Select the :mod:`ensurepip` command run on Python installation: @@ -215,7 +215,7 @@ Performance options Configuring Python using ``--enable-optimizations --with-lto`` (PGO + LTO) is recommended for best performance. -.. cmdoption:: --enable-optimizations +.. option:: --enable-optimizations Enable Profile Guided Optimization (PGO) using :envvar:`PROFILE_TASK` (disabled by default). @@ -241,7 +241,7 @@ recommended for best performance. .. versionadded:: 3.8 -.. cmdoption:: --with-lto=[full|thin|no|yes] +.. option:: --with-lto=[full|thin|no|yes] Enable Link Time Optimization (LTO) in any build (disabled by default). @@ -253,19 +253,19 @@ recommended for best performance. .. versionadded:: 3.11 To use ThinLTO feature, use ``--with-lto=thin`` on Clang. -.. cmdoption:: --with-computed-gotos +.. option:: --with-computed-gotos Enable computed gotos in evaluation loop (enabled by default on supported compilers). -.. cmdoption:: --without-pymalloc +.. option:: --without-pymalloc Disable the specialized Python memory allocator :ref:`pymalloc ` (enabled by default). See also :envvar:`PYTHONMALLOC` environment variable. -.. cmdoption:: --without-doc-strings +.. option:: --without-doc-strings Disable static documentation strings to reduce the memory footprint (enabled by default). Documentation strings defined in Python are not affected. @@ -274,7 +274,7 @@ recommended for best performance. See the ``PyDoc_STRVAR()`` macro. -.. cmdoption:: --enable-profiling +.. option:: --enable-profiling Enable C-level code profiling with ``gprof`` (disabled by default). @@ -329,12 +329,12 @@ See also the :ref:`Python Development Mode ` and the Debug options ------------- -.. cmdoption:: --with-pydebug +.. option:: --with-pydebug :ref:`Build Python in debug mode `: define the ``Py_DEBUG`` macro (disabled by default). -.. cmdoption:: --with-trace-refs +.. option:: --with-trace-refs Enable tracing references for debugging purpose (disabled by default). @@ -349,7 +349,7 @@ Debug options .. versionadded:: 3.8 -.. cmdoption:: --with-assertions +.. option:: --with-assertions Build with C assertions enabled (default is no): ``assert(...);`` and ``_PyObject_ASSERT(...);``. @@ -362,11 +362,11 @@ Debug options .. versionadded:: 3.6 -.. cmdoption:: --with-valgrind +.. option:: --with-valgrind Enable Valgrind support (default is no). -.. 
cmdoption:: --with-dtrace +.. option:: --with-dtrace Enable DTrace support (default is no). @@ -375,19 +375,19 @@ Debug options .. versionadded:: 3.6 -.. cmdoption:: --with-address-sanitizer +.. option:: --with-address-sanitizer Enable AddressSanitizer memory error detector, ``asan`` (default is no). .. versionadded:: 3.6 -.. cmdoption:: --with-memory-sanitizer +.. option:: --with-memory-sanitizer Enable MemorySanitizer allocation error detector, ``msan`` (default is no). .. versionadded:: 3.6 -.. cmdoption:: --with-undefined-behavior-sanitizer +.. option:: --with-undefined-behavior-sanitizer Enable UndefinedBehaviorSanitizer undefined behaviour detector, ``ubsan`` (default is no). @@ -398,11 +398,11 @@ Debug options Linker options -------------- -.. cmdoption:: --enable-shared +.. option:: --enable-shared Enable building a shared Python library: ``libpython`` (default is no). -.. cmdoption:: --without-static-libpython +.. option:: --without-static-libpython Do not build ``libpythonMAJOR.MINOR.a`` and do not install ``python.o`` (built and enabled by default). @@ -413,28 +413,28 @@ Linker options Libraries options ----------------- -.. cmdoption:: --with-libs='lib1 ...' +.. option:: --with-libs='lib1 ...' Link against additional libraries (default is no). -.. cmdoption:: --with-system-expat +.. option:: --with-system-expat Build the :mod:`!pyexpat` module using an installed ``expat`` library (default is no). -.. cmdoption:: --with-system-ffi +.. option:: --with-system-ffi Build the :mod:`_ctypes` extension module using an installed ``ffi`` library, see the :mod:`ctypes` module (default is system-dependent). -.. cmdoption:: --with-system-libmpdec +.. option:: --with-system-libmpdec Build the ``_decimal`` extension module using an installed ``mpdec`` library, see the :mod:`decimal` module (default is no). .. versionadded:: 3.3 -.. cmdoption:: --with-readline=editline +.. option:: --with-readline=editline Use ``editline`` library for backend of the :mod:`readline` module. @@ -442,7 +442,7 @@ Libraries options .. versionadded:: 3.10 -.. cmdoption:: --without-readline +.. option:: --without-readline Don't build the :mod:`readline` module (built by default). @@ -450,21 +450,21 @@ Libraries options .. versionadded:: 3.10 -.. cmdoption:: --with-libm=STRING +.. option:: --with-libm=STRING Override ``libm`` math library to *STRING* (default is system-dependent). -.. cmdoption:: --with-libc=STRING +.. option:: --with-libc=STRING Override ``libc`` C library to *STRING* (default is system-dependent). -.. cmdoption:: --with-openssl=DIR +.. option:: --with-openssl=DIR Root of the OpenSSL directory. .. versionadded:: 3.7 -.. cmdoption:: --with-openssl-rpath=[no|auto|DIR] +.. option:: --with-openssl-rpath=[no|auto|DIR] Set runtime library directory (rpath) for OpenSSL libraries: @@ -479,7 +479,7 @@ Libraries options Security Options ---------------- -.. cmdoption:: --with-hash-algorithm=[fnv|siphash13|siphash24] +.. option:: --with-hash-algorithm=[fnv|siphash13|siphash24] Select hash algorithm for use in ``Python/pyhash.c``: @@ -492,7 +492,7 @@ Security Options .. versionadded:: 3.11 ``siphash13`` is added and it is the new default. -.. cmdoption:: --with-builtin-hashlib-hashes=md5,sha1,sha256,sha512,sha3,blake2 +.. option:: --with-builtin-hashlib-hashes=md5,sha1,sha256,sha512,sha3,blake2 Built-in hash modules: @@ -505,7 +505,7 @@ Security Options .. versionadded:: 3.9 -.. cmdoption:: --with-ssl-default-suites=[python|openssl|STRING] +.. 
option:: --with-ssl-default-suites=[python|openssl|STRING] Override the OpenSSL default cipher suites string: @@ -527,19 +527,19 @@ macOS Options See ``Mac/README.rst``. -.. cmdoption:: --enable-universalsdk -.. cmdoption:: --enable-universalsdk=SDKDIR +.. option:: --enable-universalsdk +.. option:: --enable-universalsdk=SDKDIR Create a universal binary build. *SDKDIR* specifies which macOS SDK should be used to perform the build (default is no). -.. cmdoption:: --enable-framework -.. cmdoption:: --enable-framework=INSTALLDIR +.. option:: --enable-framework +.. option:: --enable-framework=INSTALLDIR Create a Python.framework rather than a traditional Unix install. Optional *INSTALLDIR* specifies the installation path (default is no). -.. cmdoption:: --with-universal-archs=ARCH +.. option:: --with-universal-archs=ARCH Specify the kind of universal binary that should be created. This option is only valid when :option:`--enable-universalsdk` is set. @@ -555,7 +555,7 @@ See ``Mac/README.rst``. * ``intel-64``; * ``all``. -.. cmdoption:: --with-framework-name=FRAMEWORK +.. option:: --with-framework-name=FRAMEWORK Specify the name for the python framework on macOS only valid when :option:`--enable-framework` is set (default: ``Python``). @@ -569,21 +569,21 @@ for another CPU architecture or platform. Cross compiling requires a Python interpreter for the build platform. The version of the build Python must match the version of the cross compiled host Python. -.. cmdoption:: --build=BUILD +.. option:: --build=BUILD configure for building on BUILD, usually guessed by :program:`config.guess`. -.. cmdoption:: --host=HOST +.. option:: --host=HOST cross-compile to build programs to run on HOST (target platform) -.. cmdoption:: --with-build-python=path/to/python +.. option:: --with-build-python=path/to/python path to build ``python`` binary for cross compiling .. versionadded:: 3.11 -.. cmdoption:: CONFIG_SITE=file +.. option:: CONFIG_SITE=file An environment variable that points to a file with configure overrides. From 8da3367067f3e22d2369a00640185525813520fb Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 12:18:44 -0700 Subject: [PATCH 021/265] [3.11] gh-110167: Fix test_socket deadlock in doCleanups() (GH-110416) (#110424) gh-110167: Fix test_socket deadlock in doCleanups() (GH-110416) Fix a deadlock in test_socket when server fails with a timeout but the client is still running in its thread. Don't hold a lock to call cleanup functions in doCleanups(). One of the cleanup function waits until the client completes, whereas the client could deadlock if it called addCleanup() in such situation. doCleanups() is called when the server completed, but the client can still be running in its thread especially if the server failed with a timeout. Don't put a lock on doCleanups() to prevent deadlock between addCleanup() called in the client and doCleanups() waiting for self.done.wait of ThreadableTest._setUp(). 
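A minimal sketch of the resulting pattern, with the test-suite specifics stripped away (class and attribute names simplified; the real code uses a mixin rather than a TestCase subclass): addCleanup() stays serialized behind a recursive lock so the client thread can register cleanups safely, while doCleanups() is deliberately left without a lock so a cleanup that waits for the client thread cannot deadlock against it:

    import threading
    import unittest

    class ThreadSafeCleanupTestCase(unittest.TestCase):
        def __init__(self, *args, **kwargs):
            super().__init__(*args, **kwargs)
            # Recursive lock: addCleanup() may be called re-entrantly.
            self._cleanup_lock = threading.RLock()

        def addCleanup(self, *args, **kwargs):
            # Registering a cleanup is cheap, so holding the lock here is safe
            # even when the client thread calls this concurrently.
            with self._cleanup_lock:
                return super().addCleanup(*args, **kwargs)

        # doCleanups() is intentionally not overridden: one of the cleanups
        # waits for the client thread to finish, and that thread may itself
        # be inside addCleanup(), so taking the same lock here could deadlock.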
(cherry picked from commit 318f5df27109ff8d2519edefa771920a0ec62b92) Co-authored-by: Victor Stinner --- Lib/test/test_socket.py | 12 +++++++----- .../2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst | 5 +++++ 2 files changed, 12 insertions(+), 5 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 190efb416ba39a..bc93d9988eaf03 100644 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -204,8 +204,13 @@ def setUp(self): class ThreadSafeCleanupTestCase: """Subclass of unittest.TestCase with thread-safe cleanup methods. - This subclass protects the addCleanup() and doCleanups() methods - with a recursive lock. + This subclass protects the addCleanup() method with a recursive lock. + + doCleanups() is called when the server completed, but the client can still + be running in its thread especially if the server failed with a timeout. + Don't put a lock on doCleanups() to prevent deadlock between addCleanup() + called in the client and doCleanups() waiting for self.done.wait of + ThreadableTest._setUp() (gh-110167) """ def __init__(self, *args, **kwargs): @@ -216,9 +221,6 @@ def addCleanup(self, *args, **kwargs): with self._cleanup_lock: return super().addCleanup(*args, **kwargs) - def doCleanups(self, *args, **kwargs): - with self._cleanup_lock: - return super().doCleanups(*args, **kwargs) class SocketCANTest(unittest.TestCase): diff --git a/Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst b/Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst new file mode 100644 index 00000000000000..d0cbbf9c3788bb --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst @@ -0,0 +1,5 @@ +Fix a deadlock in test_socket when server fails with a timeout but the +client is still running in its thread. Don't hold a lock to call cleanup +functions in doCleanups(). One of the cleanup function waits until the +client completes, whereas the client could deadlock if it called +addCleanup() in such situation. Patch by Victor Stinner. From a503bdf21fefb61999d1b4afdd288156b3138655 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 12:53:14 -0700 Subject: [PATCH 022/265] [3.11] gh-109840: Fix multiprocessing test_waitfor_timeout() (GH-110428) (#110431) gh-109840: Fix multiprocessing test_waitfor_timeout() (GH-110428) Don't measure the CI performance: don't fail if cond.wait_for() takes longer than 1 second on a slow CI. 
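Reduced to a self-contained sketch (a hypothetical helper, threading instead of multiprocessing, throwaway numbers), the timing check the test keeps is a lower bound only; an upper bound is exactly what a slow CI machine would violate:

    import threading
    import time

    def check_wait_for_timeout(expected=0.100):
        cond = threading.Condition()
        start = time.monotonic()
        with cond:
            # Nothing ever notifies the condition, so this always times out.
            result = cond.wait_for(lambda: False, timeout=expected)
        dt = time.monotonic() - start
        assert not result
        # Lower bound only: waking up early would be a real bug.
        # No upper bound: a loaded CI machine may legitimately take much
        # longer than `expected` to schedule the thread again.
        assert dt >= expected * 0.6, f"woke up after only {dt:.3f}s"

    check_wait_for_timeout()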
(cherry picked from commit 5eae8dc2cb832af6ae1ee340fb0194107fe3bd6e) Co-authored-by: Victor Stinner --- Lib/test/_test_multiprocessing.py | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 6459757eb93d51..42441fed79bf3e 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -1650,12 +1650,12 @@ def test_waitfor(self): def _test_waitfor_timeout_f(cls, cond, state, success, sem): sem.release() with cond: - expected = 0.1 + expected = 0.100 dt = time.monotonic() result = cond.wait_for(lambda : state.value==4, timeout=expected) dt = time.monotonic() - dt # borrow logic in assertTimeout() from test/lock_tests.py - if not result and expected * 0.6 < dt < expected * 10.0: + if not result and expected * 0.6 <= dt: success.value = True @unittest.skipUnless(HAS_SHAREDCTYPES, 'needs sharedctypes') @@ -1674,7 +1674,7 @@ def test_waitfor_timeout(self): # Only increment 3 times, so state == 4 is never reached. for i in range(3): - time.sleep(0.01) + time.sleep(0.010) with cond: state.value += 1 cond.notify() From a55c203104cedfa5f5c600671607ee2f31425690 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 13:14:32 -0700 Subject: [PATCH 023/265] [3.11] gh-110383: Swap 'the all' -> 'all the' in socket docs (GH-110434) (#110436) Co-authored-by: Bradley Reynolds --- Doc/library/socket.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index 36151ca650e52e..32afacbbd231a7 100644 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -2001,7 +2001,7 @@ The next two examples are identical to the above two, but support both IPv4 and IPv6. The server side will listen to the first address family available (it should listen to both instead). On most of IPv6-ready systems, IPv6 will take precedence and the server may not accept IPv4 traffic. The client side will try -to connect to the all addresses returned as a result of the name resolution, and +to connect to all the addresses returned as a result of the name resolution, and sends traffic to the first one connected successfully. :: # Echo server program From 779481ef15ea44bde76a45e1ddfcbc8b9a39809c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 13:49:32 -0700 Subject: [PATCH 024/265] [3.11] gh-110429: Fix race condition in "make regen-all" (GH-110433) (#110439) gh-110429: Fix race condition in "make regen-all" (GH-110433) "make regen-pegen" now creates a temporary file called "parser.c.new" instead of "parser.new.c". Previously, if "make clinic" was run in parallel with "make regen-all", clinic may try but fail to open "parser.new.c" if the temporay file was removed in the meanwhile. 
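The pattern behind the rename, as a hedged sketch (a hypothetical update_file() helper with hypothetical paths, not the Makefile rule itself): generate into a temporary name that other tools will not mistake for a real source file ("parser.c.new" does not end in ".c", so a concurrently running "make clinic" leaves it alone), and only then move it into place:

    import os

    def update_file(path, new_content):
        # Write to e.g. "parser.c.new" rather than "parser.new.c": the ".new"
        # suffix keeps tools scanning for "*.c" away from the half-written file.
        tmp = path + ".new"
        with open(tmp, "w") as f:
            f.write(new_content)
        # Swap the file into place in one step, so readers only ever see the
        # old content or the complete new content.
        os.replace(tmp, path)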
(cherry picked from commit fb6c4ed2bbb2a867d5f0b9a94656e4714be5d9c2) Co-authored-by: Victor Stinner --- Makefile.pre.in | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/Makefile.pre.in b/Makefile.pre.in index ba8e6349e815de..f84ab130ddbd2e 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1301,8 +1301,8 @@ regen-pegen: PYTHONPATH=$(srcdir)/Tools/peg_generator $(PYTHON_FOR_REGEN) -m pegen -q c \ $(srcdir)/Grammar/python.gram \ $(srcdir)/Grammar/Tokens \ - -o $(srcdir)/Parser/parser.new.c - $(UPDATE_FILE) $(srcdir)/Parser/parser.c $(srcdir)/Parser/parser.new.c + -o $(srcdir)/Parser/parser.c.new + $(UPDATE_FILE) $(srcdir)/Parser/parser.c $(srcdir)/Parser/parser.c.new .PHONY=regen-ast regen-ast: From 331d90f30bcdefce6647c56126ad019a42499f15 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 14:10:01 -0700 Subject: [PATCH 025/265] [3.11] [3.12] gh-110167: Increase support.LOOPBACK_TIMEOUT to 10 seconds (GH-110413) (GH-110427) (#110440) [3.12] gh-110167: Increase support.LOOPBACK_TIMEOUT to 10 seconds (GH-110413) (GH-110427) gh-110167: Increase support.LOOPBACK_TIMEOUT to 10 seconds (GH-110413) Increase support.LOOPBACK_TIMEOUT from 5 to 10 seconds. Also increase the timeout depending on the --timeout option. For example, for a test timeout of 40 minutes (ARM Raspbian 3.x), use LOOPBACK_TIMEOUT of 20 seconds instead of 5 seconds before. (cherry picked from commit 350d89b79588ebd140c3987cc05e3719ca17a973) Co-authored-by: Victor Stinner (cherry picked from commit 0db2f1475e6539e1954e1f8bd53e005c3ecd6a26) Co-authored-by: Victor Stinner --- Lib/test/libregrtest/setup.py | 17 ++++++++++------- Lib/test/support/__init__.py | 8 +------- 2 files changed, 11 insertions(+), 14 deletions(-) diff --git a/Lib/test/libregrtest/setup.py b/Lib/test/libregrtest/setup.py index c097d6d597aa05..ecd7fa33107311 100644 --- a/Lib/test/libregrtest/setup.py +++ b/Lib/test/libregrtest/setup.py @@ -88,16 +88,19 @@ def _test_audit_hook(name, args): setup_unraisable_hook() setup_threading_excepthook() - if ns.timeout is not None: + timeout = ns.timeout + if timeout is not None: # For a slow buildbot worker, increase SHORT_TIMEOUT and LONG_TIMEOUT - support.SHORT_TIMEOUT = max(support.SHORT_TIMEOUT, ns.timeout / 40) - support.LONG_TIMEOUT = max(support.LONG_TIMEOUT, ns.timeout / 4) + support.LOOPBACK_TIMEOUT = max(support.LOOPBACK_TIMEOUT, timeout / 120) + # don't increase INTERNET_TIMEOUT + support.SHORT_TIMEOUT = max(support.SHORT_TIMEOUT, timeout / 40) + support.LONG_TIMEOUT = max(support.LONG_TIMEOUT, timeout / 4) # If --timeout is short: reduce timeouts - support.LOOPBACK_TIMEOUT = min(support.LOOPBACK_TIMEOUT, ns.timeout) - support.INTERNET_TIMEOUT = min(support.INTERNET_TIMEOUT, ns.timeout) - support.SHORT_TIMEOUT = min(support.SHORT_TIMEOUT, ns.timeout) - support.LONG_TIMEOUT = min(support.LONG_TIMEOUT, ns.timeout) + support.LOOPBACK_TIMEOUT = min(support.LOOPBACK_TIMEOUT, timeout) + support.INTERNET_TIMEOUT = min(support.INTERNET_TIMEOUT, timeout) + support.SHORT_TIMEOUT = min(support.SHORT_TIMEOUT, timeout) + support.LONG_TIMEOUT = min(support.LONG_TIMEOUT, timeout) if ns.xmlpath: from test.support.testresult import RegressionTestResult diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index fbbd59d719d193..9da0c8e5c355ba 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -72,13 +72,7 @@ # # The timeout should be long enough for connect(), recv() and send() methods 
# of socket.socket. -LOOPBACK_TIMEOUT = 5.0 -if sys.platform == 'win32' and ' 32 bit (ARM)' in sys.version: - # bpo-37553: test_socket.SendfileUsingSendTest is taking longer than 2 - # seconds on Windows ARM32 buildbot - LOOPBACK_TIMEOUT = 10 -elif sys.platform == 'vxworks': - LOOPBACK_TIMEOUT = 10 +LOOPBACK_TIMEOUT = 10.0 # Timeout in seconds for network requests going to the internet. The timeout is # short enough to prevent a test to wait for too long if the internet request From 4134036ce9673b76cc8e6236361a77098ee72ad1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 15:05:20 -0700 Subject: [PATCH 026/265] [3.11] gh-110393: Remove watchdog with hardcoded timeout (GH-110400) (#110444) gh-110393: Remove watchdog with hardcoded timeout (GH-110400) test_builtin and test_socketserver no longer use signal.alarm() to implement a watchdog with a hardcoded timeout (2 and 60 seconds). Python test runner regrtest has two watchdogs: faulthandler and timeout on running worker processes. Tests using short hardcoded timeout can fail on slowest buildbots just because the timeout is too short. (cherry picked from commit 1328fa31fe9c72748fc6fd11d017c82aafd48a49) Co-authored-by: Victor Stinner --- Lib/test/test_builtin.py | 2 -- Lib/test/test_socketserver.py | 7 ------- 2 files changed, 9 deletions(-) diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py index 2f56121616629e..d6edc2b431d32a 100644 --- a/Lib/test/test_builtin.py +++ b/Lib/test/test_builtin.py @@ -2146,8 +2146,6 @@ def _run_child(self, child, terminal_input): if pid == 0: # Child try: - # Make sure we don't get stuck if there's a problem - signal.alarm(2) os.close(r) with open(w, "w") as wpipe: child(wpipe) diff --git a/Lib/test/test_socketserver.py b/Lib/test/test_socketserver.py index 2edb1e0c0e21e2..80e1968c0bf676 100644 --- a/Lib/test/test_socketserver.py +++ b/Lib/test/test_socketserver.py @@ -33,11 +33,6 @@ HAVE_FORKING = test.support.has_fork_support requires_forking = unittest.skipUnless(HAVE_FORKING, 'requires forking') -def signal_alarm(n): - """Call signal.alarm when it exists (i.e. not on Windows).""" - if hasattr(signal, 'alarm'): - signal.alarm(n) - # Remember real select() to avoid interferences with mocking _real_select = select.select @@ -77,12 +72,10 @@ class SocketServerTest(unittest.TestCase): """Test all socket servers.""" def setUp(self): - signal_alarm(60) # Kill deadlocks after 60 seconds. self.port_seed = 0 self.test_files = [] def tearDown(self): - signal_alarm(0) # Didn't deadlock. reap_children() for fn in self.test_files: From 67129c379cdfc7e880bc1cfdef3df1f562c94853 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 15:06:28 -0700 Subject: [PATCH 027/265] [3.11] gh-109888: Fix test_os _kill_with_event() on Windows (GH-110421) (#110443) gh-109888: Fix test_os _kill_with_event() on Windows (GH-110421) Replace os.kill() with proc.kill() which catchs PermissionError. Rewrite _kill_with_event(): * Use subprocess context manager ("with proc:"). * Use sleeping_retry() to wait until the child process is ready. * Replace SIGINT with proc.kill() on error. * Replace 10 seconds with SHORT_TIMEOUT to wait until the process is ready. * Replace 0.5 seconds with SHORT_TIMEOUT to wait for the process exit. 
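The shape of that rewrite, as a standalone sketch (plain subprocess calls and a stand-in timeout instead of the test.support helpers, plus a hypothetical do-nothing child): manage the child with a context manager, give the wait a bounded timeout, and fall back to proc.kill(), which, unlike a raw os.kill(), tolerates a child that has already exited:

    import subprocess
    import sys

    SHORT_TIMEOUT = 5.0  # stand-in for test.support.SHORT_TIMEOUT

    # Hypothetical child process that ignores us and sleeps "forever".
    proc = subprocess.Popen(
        [sys.executable, "-c", "import time; time.sleep(60)"])
    with proc:
        try:
            # Bounded wait instead of a hard-coded time.sleep(0.5) + poll().
            proc.wait(timeout=SHORT_TIMEOUT)
        except subprocess.TimeoutExpired:
            # Popen.kill() instead of os.kill(): on Windows it copes with a
            # process that already exited rather than raising PermissionError.
            proc.kill()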
(cherry picked from commit aaf297c048694cd9652790f8b74e69f7ddadfbde) Co-authored-by: Victor Stinner --- Lib/test/test_os.py | 50 ++++++++++++++++++++++++--------------------- 1 file changed, 27 insertions(+), 23 deletions(-) diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index 1e1980e3ac47e5..ff1121efdf74da 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -2505,30 +2505,34 @@ def _kill_with_event(self, event, name): tagname = "test_os_%s" % uuid.uuid1() m = mmap.mmap(-1, 1, tagname) m[0] = 0 + # Run a script which has console control handling enabled. - proc = subprocess.Popen([sys.executable, - os.path.join(os.path.dirname(__file__), - "win_console_handler.py"), tagname], - creationflags=subprocess.CREATE_NEW_PROCESS_GROUP) - # Let the interpreter startup before we send signals. See #3137. - count, max = 0, 100 - while count < max and proc.poll() is None: - if m[0] == 1: - break - time.sleep(0.1) - count += 1 - else: - # Forcefully kill the process if we weren't able to signal it. - os.kill(proc.pid, signal.SIGINT) - self.fail("Subprocess didn't finish initialization") - os.kill(proc.pid, event) - # proc.send_signal(event) could also be done here. - # Allow time for the signal to be passed and the process to exit. - time.sleep(0.5) - if not proc.poll(): - # Forcefully kill the process if we weren't able to signal it. - os.kill(proc.pid, signal.SIGINT) - self.fail("subprocess did not stop on {}".format(name)) + script = os.path.join(os.path.dirname(__file__), + "win_console_handler.py") + cmd = [sys.executable, script, tagname] + proc = subprocess.Popen(cmd, + creationflags=subprocess.CREATE_NEW_PROCESS_GROUP) + + with proc: + # Let the interpreter startup before we send signals. See #3137. + for _ in support.sleeping_retry(support.SHORT_TIMEOUT): + if proc.poll() is None: + break + else: + # Forcefully kill the process if we weren't able to signal it. + proc.kill() + self.fail("Subprocess didn't finish initialization") + + os.kill(proc.pid, event) + + try: + # proc.send_signal(event) could also be done here. + # Allow time for the signal to be passed and the process to exit. + proc.wait(timeout=support.SHORT_TIMEOUT) + except subprocess.TimeoutExpired: + # Forcefully kill the process if we weren't able to signal it. + proc.kill() + self.fail("subprocess did not stop on {}".format(name)) @unittest.skip("subprocesses aren't inheriting Ctrl+C property") @support.requires_subprocess() From f7a1d7d060168f668dbc985bd978d8b009d6e300 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 18:10:39 -0700 Subject: [PATCH 028/265] [3.11] gh-103053: Fix make check-clean-src: check "python" program (GH-110449) (#110454) gh-103053: Fix make check-clean-src: check "python" program (GH-110449) "make check-clean-src" now also checks if the "python" program is found in the source directory: fail with an error if it does exist. 
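Spelled out in Python rather than Make syntax (a hypothetical check_clean_srcdir() sketch of the same check, not code that exists in the tree), the strengthened test amounts to refusing an out-of-tree build when build products are left behind in the source directory:

    import os
    import sys

    def check_clean_srcdir(srcdir):
        # The same leftovers the Makefile now looks for: the built interpreter
        # (usually named "python"), a compiled object file, and a generated
        # frozen-modules header.
        leftovers = (
            "python",
            os.path.join("Programs", "python.o"),
            os.path.join("Python", "frozen_modules", "importlib._bootstrap.h"),
        )
        found = [name for name in leftovers
                 if os.path.exists(os.path.join(srcdir, name))]
        if found:
            sys.exit(f"Error: the source directory ({srcdir}) is not clean: {found}")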
(cherry picked from commit a155f9f3427578ca5706d27e20bd0576f0395073) Co-authored-by: Victor Stinner --- Makefile.pre.in | 3 ++- .../next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst | 3 +++ 2 files changed, 5 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst diff --git a/Makefile.pre.in b/Makefile.pre.in index f84ab130ddbd2e..1dd8189167aa21 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -592,7 +592,8 @@ build_wasm: check-clean-src $(BUILDPYTHON) platform oldsharedmods python-config # Check that the source is clean when building out of source. check-clean-src: @if test -n "$(VPATH)" -a \( \ - -f "$(srcdir)/Programs/python.o" \ + -f "$(srcdir)/$(BUILDPYTHON)" \ + -o -f "$(srcdir)/Programs/python.o" \ -o -f "$(srcdir)\Python/frozen_modules/importlib._bootstrap.h" \ \); then \ echo "Error: The source directory ($(srcdir)) is not clean" ; \ diff --git a/Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst b/Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst new file mode 100644 index 00000000000000..81aa21357287c7 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst @@ -0,0 +1,3 @@ +"make check-clean-src" now also checks if the "python" program is found in +the source directory: fail with an error if it does exist. Patch by Victor +Stinner. From 4499e6cafff114cf7cfb360706f22c4b7a54390e Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 5 Oct 2023 18:32:21 -0700 Subject: [PATCH 029/265] [3.11] gh-103053: Fix test_tools.test_freeze on FreeBSD (GH-110451) (#110457) gh-103053: Fix test_tools.test_freeze on FreeBSD (GH-110451) Fix test_tools.test_freeze on FreeBSD: run "make distclean" instead of "make clean" in the copied source directory to remove also the "python" program. Other test_freeze changes: * Log executed commands and directories, and the current directory. * No longer uses make -C option to change the directory, instead use subprocess cwd parameter. (cherry picked from commit a4baa9e8ac62cac3ea6363b15ea585b1998ea1f9) Co-authored-by: Victor Stinner --- ...-10-06-02-32-18.gh-issue-103053.VfxBLI.rst | 3 ++ Tools/freeze/test/freeze.py | 33 +++++++++++-------- 2 files changed, 22 insertions(+), 14 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst diff --git a/Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst b/Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst new file mode 100644 index 00000000000000..90a7ca512c97dd --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst @@ -0,0 +1,3 @@ +Fix test_tools.test_freeze on FreeBSD: run "make distclean" instead of "make +clean" in the copied source directory to remove also the "python" program. +Patch by Victor Stinner. 
diff --git a/Tools/freeze/test/freeze.py b/Tools/freeze/test/freeze.py index cdf77c57bbb6ae..d5f4d3d24c2bd6 100644 --- a/Tools/freeze/test/freeze.py +++ b/Tools/freeze/test/freeze.py @@ -27,8 +27,10 @@ class UnsupportedError(Exception): """The operation isn't supported.""" -def _run_quiet(cmd, cwd=None): - #print(f'# {" ".join(shlex.quote(a) for a in cmd)}') +def _run_quiet(cmd, *, cwd=None): + if cwd: + print('+', 'cd', cwd, flush=True) + print('+', shlex.join(cmd), flush=True) try: return subprocess.run( cmd, @@ -48,8 +50,8 @@ def _run_quiet(cmd, cwd=None): raise -def _run_stdout(cmd, cwd=None): - proc = _run_quiet(cmd, cwd) +def _run_stdout(cmd): + proc = _run_quiet(cmd) return proc.stdout.strip() @@ -91,13 +93,18 @@ def copy_source_tree(newroot, oldroot): shutil.copytree(oldroot, newroot, ignore=support.copy_python_src_ignore) if os.path.exists(os.path.join(newroot, 'Makefile')): - _run_quiet([MAKE, 'clean'], newroot) + # Out-of-tree builds require a clean srcdir. "make clean" keeps + # the "python" program, so use "make distclean" instead. + _run_quiet([MAKE, 'distclean'], cwd=newroot) ################################## # freezing def prepare(script=None, outdir=None): + print() + print("cwd:", os.getcwd()) + if not outdir: outdir = OUTDIR os.makedirs(outdir, exist_ok=True) @@ -125,7 +132,7 @@ def prepare(script=None, outdir=None): ensure_opt(cmd, 'cache-file', os.path.join(outdir, 'python-config.cache')) prefix = os.path.join(outdir, 'python-installation') ensure_opt(cmd, 'prefix', prefix) - _run_quiet(cmd, builddir) + _run_quiet(cmd, cwd=builddir) if not MAKE: raise UnsupportedError('make') @@ -135,20 +142,18 @@ def prepare(script=None, outdir=None): # this test is most often run as part of the whole suite with a lot # of other tests running in parallel, from 1-2 vCPU systems up to # people's NNN core beasts. Don't attempt to use it all. - parallel = f'-j{cores*2//3}' + jobs = cores * 2 // 3 + parallel = f'-j{jobs}' else: parallel = '-j2' # Build python. print(f'building python {parallel=} in {builddir}...') - if os.path.exists(os.path.join(srcdir, 'Makefile')): - # Out-of-tree builds require a clean srcdir. - _run_quiet([MAKE, '-C', srcdir, 'clean']) - _run_quiet([MAKE, '-C', builddir, parallel]) + _run_quiet([MAKE, parallel], cwd=builddir) # Install the build. 
print(f'installing python into {prefix}...') - _run_quiet([MAKE, '-C', builddir, 'install']) + _run_quiet([MAKE, 'install'], cwd=builddir) python = os.path.join(prefix, 'bin', 'python3') return outdir, scriptfile, python @@ -161,8 +166,8 @@ def freeze(python, scriptfile, outdir): print(f'freezing {scriptfile}...') os.makedirs(outdir, exist_ok=True) # Use -E to ignore PYTHONSAFEPATH - _run_quiet([python, '-E', FREEZE, '-o', outdir, scriptfile], outdir) - _run_quiet([MAKE, '-C', os.path.dirname(scriptfile)]) + _run_quiet([python, '-E', FREEZE, '-o', outdir, scriptfile], cwd=outdir) + _run_quiet([MAKE], cwd=os.path.dirname(scriptfile)) name = os.path.basename(scriptfile).rpartition('.')[0] executable = os.path.join(outdir, name) From 88223f15d792159bb2ab26923cb21bee910a4565 Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Fri, 6 Oct 2023 12:19:49 +0200 Subject: [PATCH 030/265] [3.11] Add support.MS_WINDOWS constant (#110446) (#110452) (#110464) [3.12] Add support.MS_WINDOWS constant (#110446) (#110452) Add support.MS_WINDOWS constant (#110446) (cherry picked from commit e0c44377935de3491b2cbe1e5f87f8b336fdc922) (cherry picked from commit e188534607761af1f8faba483ce0b1e8578c73e5) --- Lib/test/_test_embed_set_config.py | 2 +- Lib/test/pythoninfo.py | 13 +++++++++++++ Lib/test/support/__init__.py | 4 +++- Lib/test/test_asyncio/test_subprocess.py | 5 ++--- Lib/test/test_cppext/__init__.py | 5 ++--- Lib/test/test_cppext/setup.py | 6 ++---- Lib/test/test_embed.py | 4 +--- Lib/test/test_faulthandler.py | 4 +--- Lib/test/test_gdb/__init__.py | 3 +-- Lib/test/test_utf8_mode.py | 3 +-- 10 files changed, 27 insertions(+), 22 deletions(-) diff --git a/Lib/test/_test_embed_set_config.py b/Lib/test/_test_embed_set_config.py index 7ff641b37bf188..05e83a65fd4a35 100644 --- a/Lib/test/_test_embed_set_config.py +++ b/Lib/test/_test_embed_set_config.py @@ -9,9 +9,9 @@ import os import sys import unittest +from test.support import MS_WINDOWS -MS_WINDOWS = (os.name == 'nt') MAX_HASH_SEED = 4294967295 class SetConfigTests(unittest.TestCase): diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index db8dfe247cca25..3dbcac2f86e81e 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -713,6 +713,19 @@ def collect_test_support(info_add): attributes = ('IPV6_ENABLED',) copy_attributes(info_add, support, 'test_support.%s', attributes) + attributes = ( + 'MS_WINDOWS', + 'has_fork_support', + 'has_socket_support', + 'has_strftime_extensions', + 'has_subprocess_support', + 'is_android', + 'is_emscripten', + 'is_jython', + 'is_wasi', + ) + copy_attributes(info_add, support, 'support.%s', attributes) + call_func(info_add, 'test_support._is_gui_available', support, '_is_gui_available') call_func(info_add, 'test_support.python_is_optimized', support, 'python_is_optimized') diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 9da0c8e5c355ba..1d3b0053bf86c9 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -48,7 +48,7 @@ "check__all__", "skip_if_buggy_ucrt_strfptime", "check_disallow_instantiation", "check_sanitizer", "skip_if_sanitizer", # sys - "is_jython", "is_android", "is_emscripten", "is_wasi", + "MS_WINDOWS", "is_jython", "is_android", "is_emscripten", "is_wasi", "check_impl_detail", "unix_shell", "setswitchinterval", # network "open_urlresource", @@ -501,6 +501,8 @@ def requires_debug_ranges(reason='requires co_positions / debug_ranges'): requires_legacy_unicode_capi = unittest.skipUnless(unicode_legacy_string, 'requires legacy Unicode C 
API') +MS_WINDOWS = (sys.platform == 'win32') + is_jython = sys.platform.startswith('java') is_android = hasattr(sys, 'getandroidapilevel') diff --git a/Lib/test/test_asyncio/test_subprocess.py b/Lib/test/test_asyncio/test_subprocess.py index 8b4f14eb48de65..f0d1d95d2c1ec0 100644 --- a/Lib/test/test_asyncio/test_subprocess.py +++ b/Lib/test/test_asyncio/test_subprocess.py @@ -15,8 +15,7 @@ from test.support import os_helper -MS_WINDOWS = (sys.platform == 'win32') -if MS_WINDOWS: +if support.MS_WINDOWS: import msvcrt else: from asyncio import unix_events @@ -266,7 +265,7 @@ def test_stdin_broken_pipe(self): rfd, wfd = os.pipe() self.addCleanup(os.close, rfd) self.addCleanup(os.close, wfd) - if MS_WINDOWS: + if support.MS_WINDOWS: handle = msvcrt.get_osfhandle(rfd) os.set_handle_inheritable(handle, True) code = textwrap.dedent(f''' diff --git a/Lib/test/test_cppext/__init__.py b/Lib/test/test_cppext/__init__.py index 37247e4ac01e71..b3999d011e997b 100644 --- a/Lib/test/test_cppext/__init__.py +++ b/Lib/test/test_cppext/__init__.py @@ -9,7 +9,6 @@ from test.support import os_helper -MS_WINDOWS = (sys.platform == 'win32') SETUP = os.path.join(os.path.dirname(__file__), 'setup.py') @@ -25,7 +24,7 @@ def test_build_cpp03(self): # With MSVC, the linker fails with: cannot open file 'python311.lib' # https://github.com/python/cpython/pull/32175#issuecomment-1111175897 - @unittest.skipIf(MS_WINDOWS, 'test fails on Windows') + @unittest.skipIf(support.MS_WINDOWS, 'test fails on Windows') # Building and running an extension in clang sanitizing mode is not # straightforward @unittest.skipIf( @@ -53,7 +52,7 @@ def _check_build(self, std_cpp03, extension_name): python_exe = 'python' if sys.executable.endswith('.exe'): python_exe += '.exe' - if MS_WINDOWS: + if support.MS_WINDOWS: python = os.path.join(venv_dir, 'Scripts', python_exe) else: python = os.path.join(venv_dir, 'bin', python_exe) diff --git a/Lib/test/test_cppext/setup.py b/Lib/test/test_cppext/setup.py index f74124775acd06..10f4ed21207fc5 100644 --- a/Lib/test/test_cppext/setup.py +++ b/Lib/test/test_cppext/setup.py @@ -4,15 +4,13 @@ import shlex import sys import sysconfig +from test import support from setuptools import setup, Extension -MS_WINDOWS = (sys.platform == 'win32') - - SOURCE = os.path.join(os.path.dirname(__file__), 'extension.cpp') -if not MS_WINDOWS: +if not support.MS_WINDOWS: # C++ compiler flags for GCC and clang CPPFLAGS = [ # gh-91321: The purpose of _testcppext extension is to check that building diff --git a/Lib/test/test_embed.py b/Lib/test/test_embed.py index 2affa17e800c07..1026dcd0acaab7 100644 --- a/Lib/test/test_embed.py +++ b/Lib/test/test_embed.py @@ -1,7 +1,6 @@ # Run the tests in Programs/_testembed.c (tests for the CPython embedding APIs) from test import support -from test.support import import_helper -from test.support import os_helper +from test.support import import_helper, os_helper, MS_WINDOWS import unittest from collections import namedtuple @@ -20,7 +19,6 @@ if not support.has_subprocess_support: raise unittest.SkipTest("test module requires subprocess") -MS_WINDOWS = (os.name == 'nt') MACOS = (sys.platform == 'darwin') Py_DEBUG = hasattr(sys, 'gettotalrefcount') PYMEM_ALLOCATOR_NOT_SET = 0 diff --git a/Lib/test/test_faulthandler.py b/Lib/test/test_faulthandler.py index 0e03ee2128891c..50b40d336e4dca 100644 --- a/Lib/test/test_faulthandler.py +++ b/Lib/test/test_faulthandler.py @@ -7,8 +7,7 @@ import subprocess import sys from test import support -from test.support import os_helper -from test.support 
import script_helper, is_android +from test.support import os_helper, script_helper, is_android, MS_WINDOWS from test.support import skip_if_sanitizer import tempfile import unittest @@ -23,7 +22,6 @@ raise unittest.SkipTest("test module requires subprocess") TIMEOUT = 0.5 -MS_WINDOWS = (os.name == 'nt') def expected_traceback(lineno1, lineno2, header, min_count=1): diff --git a/Lib/test/test_gdb/__init__.py b/Lib/test/test_gdb/__init__.py index d74075e456792d..99557739af6748 100644 --- a/Lib/test/test_gdb/__init__.py +++ b/Lib/test/test_gdb/__init__.py @@ -9,8 +9,7 @@ from test import support -MS_WINDOWS = (os.name == 'nt') -if MS_WINDOWS: +if support.MS_WINDOWS: # On Windows, Python is usually built by MSVC. Passing /p:DebugSymbols=true # option to MSBuild produces PDB debug symbols, but gdb doesn't support PDB # debug symbol files. diff --git a/Lib/test/test_utf8_mode.py b/Lib/test/test_utf8_mode.py index ec29ba6d51b127..f66881044e16df 100644 --- a/Lib/test/test_utf8_mode.py +++ b/Lib/test/test_utf8_mode.py @@ -9,10 +9,9 @@ import unittest from test import support from test.support.script_helper import assert_python_ok, assert_python_failure -from test.support import os_helper +from test.support import os_helper, MS_WINDOWS -MS_WINDOWS = (sys.platform == 'win32') POSIX_LOCALES = ('C', 'POSIX') VXWORKS = (sys.platform == "vxworks") From 49a45f6cd80605998eb50fd95ca4b5531674e290 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 6 Oct 2023 07:11:51 -0700 Subject: [PATCH 031/265] [3.11] gh-110184: Fix subprocess test_pipesize_default() (GH-110465) (#110472) gh-110184: Fix subprocess test_pipesize_default() (GH-110465) For proc.stdin, get the size of the read end of the test pipe. Use subprocess context manager ("with proc:"). (cherry picked from commit d023d4166b255023dac448305270350030101481) Co-authored-by: Victor Stinner --- Lib/test/test_subprocess.py | 41 +++++++++++++++++++++---------------- 1 file changed, 23 insertions(+), 18 deletions(-) diff --git a/Lib/test/test_subprocess.py b/Lib/test/test_subprocess.py index 3142026ac535b3..b7b16e22fa6419 100644 --- a/Lib/test/test_subprocess.py +++ b/Lib/test/test_subprocess.py @@ -747,31 +747,36 @@ def test_pipesizes(self): @unittest.skipUnless(fcntl and hasattr(fcntl, 'F_GETPIPE_SZ'), 'fcntl.F_GETPIPE_SZ required for test.') def test_pipesize_default(self): - p = subprocess.Popen( + proc = subprocess.Popen( [sys.executable, "-c", 'import sys; sys.stdin.read(); sys.stdout.write("out"); ' 'sys.stderr.write("error!")'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, pipesize=-1) - try: - fp_r, fp_w = os.pipe() + + with proc: try: - default_pipesize = fcntl.fcntl(fp_w, fcntl.F_GETPIPE_SZ) - for fifo in [p.stdin, p.stdout, p.stderr]: - self.assertEqual( - fcntl.fcntl(fifo.fileno(), fcntl.F_GETPIPE_SZ), - default_pipesize) + fp_r, fp_w = os.pipe() + try: + default_read_pipesize = fcntl.fcntl(fp_r, fcntl.F_GETPIPE_SZ) + default_write_pipesize = fcntl.fcntl(fp_w, fcntl.F_GETPIPE_SZ) + finally: + os.close(fp_r) + os.close(fp_w) + + self.assertEqual( + fcntl.fcntl(proc.stdin.fileno(), fcntl.F_GETPIPE_SZ), + default_read_pipesize) + self.assertEqual( + fcntl.fcntl(proc.stdout.fileno(), fcntl.F_GETPIPE_SZ), + default_write_pipesize) + self.assertEqual( + fcntl.fcntl(proc.stderr.fileno(), fcntl.F_GETPIPE_SZ), + default_write_pipesize) + # On other platforms we cannot test the pipe size (yet). But above + # code using pipesize=-1 should not crash. 
finally: - os.close(fp_r) - os.close(fp_w) - # On other platforms we cannot test the pipe size (yet). But above - # code using pipesize=-1 should not crash. - p.stdin.close() - p.stdout.close() - p.stderr.close() - finally: - p.kill() - p.wait() + proc.kill() def test_env(self): newenv = os.environ.copy() From 3d5aa7ec6129330c10e76435b627ff1df2cb08f7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 6 Oct 2023 07:35:23 -0700 Subject: [PATCH 032/265] [3.11] Fix typo in Doc/library/textwrap.rst (GH-110328) (#110474) Co-authored-by: InSync <122007197+InSyncWithFoo@users.noreply.github.com> --- Doc/library/textwrap.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/textwrap.rst b/Doc/library/textwrap.rst index e2952ce3cc2ca3..7445410f91808c 100644 --- a/Doc/library/textwrap.rst +++ b/Doc/library/textwrap.rst @@ -238,7 +238,7 @@ hyphenated words; only then will long words be broken if necessary, unless However, the sentence detection algorithm is imperfect: it assumes that a sentence ending consists of a lowercase letter followed by one of ``'.'``, ``'!'``, or ``'?'``, possibly followed by one of ``'"'`` or ``"'"``, - followed by a space. One problem with this is algorithm is that it is + followed by a space. One problem with this algorithm is that it is unable to detect the difference between "Dr." in :: [...] Dr. Frankenstein's monster [...] From 6a33529cf0bcfe9001af58e45ea8feab4daed265 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Sat, 7 Oct 2023 16:05:13 +0300 Subject: [PATCH 033/265] [3.11] gh-109521: Fix obscure cases handling in PyImport_GetImporter() (GH-109522) (GH-109781) PyImport_GetImporter() now sets RuntimeError if it fails to get sys.path_hooks or sys.path_importer_cache or they are not list and dict correspondingly. Previously it could return NULL without setting error in obscure cases, crash or raise SystemError if these attributes have wrong type. (cherry picked from commit 62c7015e89cbdedb5218d4fedd45f971885f67a8) --- ...2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst | 5 +++++ Python/import.c | 15 +++++++++++++-- 2 files changed, 18 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst diff --git a/Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst b/Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst new file mode 100644 index 00000000000000..338650c9246686 --- /dev/null +++ b/Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst @@ -0,0 +1,5 @@ +:c:func:`PyImport_GetImporter` now sets RuntimeError if it fails to get +:data:`sys.path_hooks` or :data:`sys.path_importer_cache` or they are not +list and dict correspondingly. Previously it could return NULL without +setting error in obscure cases, crash or raise SystemError if these +attributes have wrong type. 
diff --git a/Python/import.c b/Python/import.c index 38c23e02cec6a2..39144d302193ac 100644 --- a/Python/import.c +++ b/Python/import.c @@ -951,11 +951,22 @@ PyImport_GetImporter(PyObject *path) { PyThreadState *tstate = _PyThreadState_GET(); PyObject *path_importer_cache = PySys_GetObject("path_importer_cache"); + if (path_importer_cache == NULL) { + PyErr_SetString(PyExc_RuntimeError, "lost sys.path_importer_cache"); + return NULL; + } + Py_INCREF(path_importer_cache); PyObject *path_hooks = PySys_GetObject("path_hooks"); - if (path_importer_cache == NULL || path_hooks == NULL) { + if (path_hooks == NULL) { + PyErr_SetString(PyExc_RuntimeError, "lost sys.path_hooks"); + Py_DECREF(path_importer_cache); return NULL; } - return get_path_importer(tstate, path_importer_cache, path_hooks, path); + Py_INCREF(path_hooks); + PyObject *importer = get_path_importer(tstate, path_importer_cache, path_hooks, path); + Py_DECREF(path_hooks); + Py_DECREF(path_importer_cache); + return importer; } #if defined(__EMSCRIPTEN__) && defined(PY_CALL_TRAMPOLINE) From 682321292f0f342a88311495418ef6f84e4eb452 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 7 Oct 2023 06:23:54 -0700 Subject: [PATCH 034/265] [3.11] gh-109864: Make test_gettext tests order independent (GH-109866) (GH-110503) (cherry picked from commit 1aad4fc5dba993899621de86ae5955883448d6f6) Co-authored-by: Serhiy Storchaka --- Lib/test/test_gettext.py | 15 +++++++++++++-- 1 file changed, 13 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_gettext.py b/Lib/test/test_gettext.py index 8430fc234d00ee..2650ce501c309d 100644 --- a/Lib/test/test_gettext.py +++ b/Lib/test/test_gettext.py @@ -115,6 +115,12 @@ MMOFILE = os.path.join(LOCALEDIR, 'metadata.mo') +def reset_gettext(): + gettext._localedirs.clear() + gettext._current_domain = 'messages' + gettext._translations.clear() + + class GettextBaseTest(unittest.TestCase): def setUp(self): self.addCleanup(os_helper.rmtree, os.path.split(LOCALEDIR)[0]) @@ -132,7 +138,8 @@ def setUp(self): fp.write(base64.decodebytes(MMO_DATA)) self.env = self.enterContext(os_helper.EnvironmentVarGuard()) self.env['LANGUAGE'] = 'xx' - gettext._translations.clear() + reset_gettext() + self.addCleanup(reset_gettext) GNU_MO_DATA_ISSUE_17898 = b'''\ @@ -312,6 +319,10 @@ def test_multiline_strings(self): class PluralFormsTestCase(GettextBaseTest): def setUp(self): GettextBaseTest.setUp(self) + self.localedir = os.curdir + # Set up the bindings + gettext.bindtextdomain('gettext', self.localedir) + gettext.textdomain('gettext') self.mofile = MOFILE def test_plural_forms1(self): @@ -355,7 +366,7 @@ def test_plural_context_forms2(self): x = t.npgettext('With context', 'There is %s file', 'There are %s files', 2) eq(x, 'Hay %s ficheros (context)') - x = gettext.pgettext('With context', 'There is %s file') + x = t.pgettext('With context', 'There is %s file') eq(x, 'Hay %s fichero (context)') # Examples from http://www.gnu.org/software/gettext/manual/gettext.html From af168df78f260a809875b309f35fbf38b58c1fec Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 7 Oct 2023 06:24:27 -0700 Subject: [PATCH 035/265] [3.11] gh-109848: Make test_rot13_func in test_codecs independent (GH-109850) (GH-110505) (cherry picked from commit b987fdb19b981ef6e7f71b41790b5ed4e2064646) Co-authored-by: Serhiy Storchaka --- Lib/test/test_codecs.py | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git 
a/Lib/test/test_codecs.py b/Lib/test/test_codecs.py index 684e6cf7515481..a7440eea67c101 100644 --- a/Lib/test/test_codecs.py +++ b/Lib/test/test_codecs.py @@ -3590,9 +3590,10 @@ class Rot13UtilTest(unittest.TestCase): $ echo "Hello World" | python -m encodings.rot_13 """ def test_rot13_func(self): + from encodings.rot_13 import rot13 infile = io.StringIO('Gb or, be abg gb or, gung vf gur dhrfgvba') outfile = io.StringIO() - encodings.rot_13.rot13(infile, outfile) + rot13(infile, outfile) outfile.seek(0) plain_text = outfile.read() self.assertEqual( From f21c09ca03b39319ee6de54f5d706bd6e2313985 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 7 Oct 2023 17:29:46 -0700 Subject: [PATCH 036/265] [3.11] gh-110237: Check `PyList_Append` for errors in `_PyEval_MatchClass` (GH-110238) (#110512) gh-110237: Check `PyList_Append` for errors in `_PyEval_MatchClass` (GH-110238) (cherry picked from commit dd9d781da30aa3740e54c063a40413c542d78c25) Co-authored-by: denballakh <47365157+denballakh@users.noreply.github.com> --- .../2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst | 1 + Python/ceval.c | 14 +++++++++++--- 2 files changed, 12 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst new file mode 100644 index 00000000000000..67b95c52f7e4da --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst @@ -0,0 +1 @@ +Fix missing error checks for calls to ``PyList_Append`` in ``_PyEval_MatchClass``. diff --git a/Python/ceval.c b/Python/ceval.c index df11de084d2d72..172bc4743bda90 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -1060,7 +1060,9 @@ match_class(PyThreadState *tstate, PyObject *subject, PyObject *type, } if (match_self) { // Easy. Copy the subject itself, and move on to kwargs. 
- PyList_Append(attrs, subject); + if (PyList_Append(attrs, subject) < 0) { + goto fail; + } } else { for (Py_ssize_t i = 0; i < nargs; i++) { @@ -1076,7 +1078,10 @@ match_class(PyThreadState *tstate, PyObject *subject, PyObject *type, if (attr == NULL) { goto fail; } - PyList_Append(attrs, attr); + if (PyList_Append(attrs, attr) < 0) { + Py_DECREF(attr); + goto fail; + } Py_DECREF(attr); } } @@ -1089,7 +1094,10 @@ match_class(PyThreadState *tstate, PyObject *subject, PyObject *type, if (attr == NULL) { goto fail; } - PyList_Append(attrs, attr); + if (PyList_Append(attrs, attr) < 0) { + Py_DECREF(attr); + goto fail; + } Py_DECREF(attr); } Py_SETREF(attrs, PyList_AsTuple(attrs)); From 2bad6e715ad74da296c8081bd334ee20a68b77ae Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 8 Oct 2023 21:55:37 -0700 Subject: [PATCH 037/265] [3.11] gh-110534 fix a URL redirect to wikipedia article on Fibonacci numbers (GH-110535) (#110537) gh-110534 fix a URL redirect to wikipedia article on Fibonacci numbers (GH-110535) (cherry picked from commit 892ee72b3622de30acd12576b59259fc69e2e40a) Co-authored-by: partev --- Doc/tutorial/introduction.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/tutorial/introduction.rst b/Doc/tutorial/introduction.rst index 24ae720fe2bb62..b3b8e423a09fa1 100644 --- a/Doc/tutorial/introduction.rst +++ b/Doc/tutorial/introduction.rst @@ -480,7 +480,7 @@ First Steps Towards Programming Of course, we can use Python for more complicated tasks than adding two and two together. For instance, we can write an initial sub-sequence of the -`Fibonacci series `_ +`Fibonacci series `_ as follows:: >>> # Fibonacci series: From 6b63d40919f080998c158546c18b89877093de8b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 9 Oct 2023 11:42:01 +0200 Subject: [PATCH 038/265] [3.11] gh-110497: Add note about `OSError` being an alias to `IOError` in docs (GH-110498) (#110545) gh-110497: Add note about `OSError` being an alias to `IOError` in docs (GH-110498) (cherry picked from commit 5e7edac7717bfe5f3c533d83ddd0f564db8de40b) Co-authored-by: Nikita Sobolev --- Doc/library/ctypes.rst | 8 +++++--- Doc/library/gettext.rst | 2 +- Doc/library/http.cookiejar.rst | 4 ++-- Doc/library/urllib.error.rst | 4 ++-- Doc/library/zipimport.rst | 2 +- 5 files changed, 11 insertions(+), 9 deletions(-) diff --git a/Doc/library/ctypes.rst b/Doc/library/ctypes.rst index e2dde683a1a5d3..51cffc029017b2 100644 --- a/Doc/library/ctypes.rst +++ b/Doc/library/ctypes.rst @@ -1374,7 +1374,8 @@ way is to instantiate one of the following classes: failure, an :class:`OSError` is automatically raised. .. versionchanged:: 3.3 - :exc:`WindowsError` used to be raised. + :exc:`WindowsError` used to be raised, + which is now an alias of :exc:`OSError`. .. class:: WinDLL(name, mode=DEFAULT_MODE, handle=None, use_errno=False, use_last_error=False, winmode=None) @@ -2047,13 +2048,14 @@ Utility functions .. function:: WinError(code=None, descr=None) Windows only: this function is probably the worst-named thing in ctypes. It - creates an instance of OSError. If *code* is not specified, + creates an instance of :exc:`OSError`. If *code* is not specified, ``GetLastError`` is called to determine the error code. If *descr* is not specified, :func:`FormatError` is called to get a textual description of the error. .. versionchanged:: 3.3 - An instance of :exc:`WindowsError` used to be created. 
+ An instance of :exc:`WindowsError` used to be created, which is now an + alias of :exc:`OSError`. .. function:: wstring_at(address, size=-1) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst index 7ebe91b372d35a..dc6cf5533fccbe 100644 --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -167,7 +167,7 @@ install themselves in the built-in namespace as the function :func:`!_`. :class:`NullTranslations` instance if *fallback* is true. .. versionchanged:: 3.3 - :exc:`IOError` used to be raised instead of :exc:`OSError`. + :exc:`IOError` used to be raised, it is now an alias of :exc:`OSError`. .. versionchanged:: 3.11 *codeset* parameter is removed. diff --git a/Doc/library/http.cookiejar.rst b/Doc/library/http.cookiejar.rst index 87ef156a0bed57..12a6d768437ea5 100644 --- a/Doc/library/http.cookiejar.rst +++ b/Doc/library/http.cookiejar.rst @@ -44,8 +44,8 @@ The module defines the following exception: cookies from a file. :exc:`LoadError` is a subclass of :exc:`OSError`. .. versionchanged:: 3.3 - LoadError was made a subclass of :exc:`OSError` instead of - :exc:`IOError`. + :exc:`LoadError` used to be a subtype of :exc:`IOError`, which is now an + alias of :exc:`OSError`. The following classes are provided: diff --git a/Doc/library/urllib.error.rst b/Doc/library/urllib.error.rst index 8772e8600795eb..8af07c5ec0474c 100644 --- a/Doc/library/urllib.error.rst +++ b/Doc/library/urllib.error.rst @@ -27,8 +27,8 @@ The following exceptions are raised by :mod:`urllib.error` as appropriate: exception instance. .. versionchanged:: 3.3 - :exc:`URLError` has been made a subclass of :exc:`OSError` instead - of :exc:`IOError`. + :exc:`URLError` used to be a subtype of :exc:`IOError`, which is now an + alias of :exc:`OSError`. .. exception:: HTTPError diff --git a/Doc/library/zipimport.rst b/Doc/library/zipimport.rst index fe1adcae163c23..71647445ed37ef 100644 --- a/Doc/library/zipimport.rst +++ b/Doc/library/zipimport.rst @@ -130,7 +130,7 @@ zipimporter Objects file wasn't found. .. versionchanged:: 3.3 - :exc:`IOError` used to be raised instead of :exc:`OSError`. + :exc:`IOError` used to be raised, it is now an alias of :exc:`OSError`. .. 
method:: get_filename(fullname) From b473d48505a339225bbbfbf7199ec28c65c0f468 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 9 Oct 2023 12:31:27 +0200 Subject: [PATCH 039/265] [3.11] gh-109286: Update macOS installer to use SQLite 3.43.1 (GH-110482) (#110551) (cherry picked from commit 48419a50b44a195ad7de958f479a924e7c2d3e1b) Co-authored-by: jtranquilli <76231120+jtranquilli@users.noreply.github.com> --- Mac/BuildScript/build-installer.py | 6 +++--- .../macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst | 1 + 2 files changed, 4 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py index 980160dd3077d7..553926852a7ca8 100755 --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -358,9 +358,9 @@ def library_recipes(): ), ), dict( - name="SQLite 3.42.0", - url="https://sqlite.org/2023/sqlite-autoconf-3420000.tar.gz", - checksum="0c5a92bc51cf07cae45b4a1e94653dea", + name="SQLite 3.43.1", + url="https://sqlite.org/2023/sqlite-autoconf-3430100.tar.gz", + checksum="77e61befe9c3298da0504f87772a24b0", extra_cflags=('-Os ' '-DSQLITE_ENABLE_FTS5 ' '-DSQLITE_ENABLE_FTS4 ' diff --git a/Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst b/Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst new file mode 100644 index 00000000000000..18ac9df73deb37 --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst @@ -0,0 +1 @@ +Update macOS installer to use SQLite 3.43.1. From 26c3e700b15b8f2845d29b2872a0a5854b349730 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 9 Oct 2023 13:42:57 +0200 Subject: [PATCH 040/265] [3.11] gh-110437: Allow overriding VCRuntimeDLL with a semicolon separated list of DLLs to bundle (GH-110470) gh-110437: Allow overriding VCRuntimeDLL with a semicolon separated list of DLLs to bundle (GH-110470) (cherry picked from commit 12cc6792d0ca1d0b72712d77c6efcb0aa0c7e7ba) Co-authored-by: Steve Dower --- .../Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst | 2 ++ PCbuild/pyproject.props | 5 ++++- 2 files changed, 6 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst diff --git a/Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst b/Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst new file mode 100644 index 00000000000000..777b4942e183f5 --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst @@ -0,0 +1,2 @@ +Allows overriding the source of VC redistributables so that releases can be +guaranteed to never downgrade between updates. 
diff --git a/PCbuild/pyproject.props b/PCbuild/pyproject.props index 199bf72b0d578c..a20967205d21eb 100644 --- a/PCbuild/pyproject.props +++ b/PCbuild/pyproject.props @@ -229,7 +229,10 @@ public override bool Execute() { - + + + + From e913c6f4d6134ad04ecd59c0408d2b13ff44e342 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 9 Oct 2023 16:10:34 +0200 Subject: [PATCH 041/265] [3.11] gh-110519: Improve deprecation warning in the gettext module (GH-110520) (GH-110564) Deprecation warning about non-integer numbers in gettext now always refers to the line in the user code where gettext function or method is used. Previously, it could refer to a line in gettext code. Also, increase test coverage for NullTranslations and domain-aware functions like dngettext(). (cherry picked from commit 326c6c4e07137b43c49b74bd5528619360080469) Co-authored-by: Serhiy Storchaka --- Lib/gettext.py | 14 +- Lib/test/test_gettext.py | 178 +++++++++++++----- ...-10-08-18-15-02.gh-issue-110519.RDGe8-.rst | 3 + 3 files changed, 144 insertions(+), 51 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst diff --git a/Lib/gettext.py b/Lib/gettext.py index b72b15f82d4355..e84765bfdf0649 100644 --- a/Lib/gettext.py +++ b/Lib/gettext.py @@ -46,6 +46,7 @@ # find this format documented anywhere. +import operator import os import re import sys @@ -166,14 +167,21 @@ def _parse(tokens, priority=-1): def _as_int(n): try: - i = round(n) + round(n) except TypeError: raise TypeError('Plural value must be an integer, got %s' % (n.__class__.__name__,)) from None + import warnings + frame = sys._getframe(1) + stacklevel = 2 + while frame.f_back is not None and frame.f_globals.get('__name__') == __name__: + stacklevel += 1 + frame = frame.f_back warnings.warn('Plural value must be an integer, got %s' % (n.__class__.__name__,), - DeprecationWarning, 4) + DeprecationWarning, + stacklevel) return n @@ -200,7 +208,7 @@ def c2py(plural): elif c == ')': depth -= 1 - ns = {'_as_int': _as_int} + ns = {'_as_int': _as_int, '__name__': __name__} exec('''if True: def func(n): if not isinstance(n, int): diff --git a/Lib/test/test_gettext.py b/Lib/test/test_gettext.py index 2650ce501c309d..dd33b9b88f6768 100644 --- a/Lib/test/test_gettext.py +++ b/Lib/test/test_gettext.py @@ -2,6 +2,7 @@ import base64 import gettext import unittest +from functools import partial from test import support from test.support import os_helper @@ -122,8 +123,9 @@ def reset_gettext(): class GettextBaseTest(unittest.TestCase): - def setUp(self): - self.addCleanup(os_helper.rmtree, os.path.split(LOCALEDIR)[0]) + @classmethod + def setUpClass(cls): + cls.addClassCleanup(os_helper.rmtree, os.path.split(LOCALEDIR)[0]) if not os.path.isdir(LOCALEDIR): os.makedirs(LOCALEDIR) with open(MOFILE, 'wb') as fp: @@ -136,6 +138,8 @@ def setUp(self): fp.write(base64.decodebytes(UMO_DATA)) with open(MMOFILE, 'wb') as fp: fp.write(base64.decodebytes(MMO_DATA)) + + def setUp(self): self.env = self.enterContext(os_helper.EnvironmentVarGuard()) self.env['LANGUAGE'] = 'xx' reset_gettext() @@ -316,59 +320,137 @@ def test_multiline_strings(self): trggrkg zrffntr pngnybt yvoenel.''') -class PluralFormsTestCase(GettextBaseTest): +class PluralFormsTests: + + def _test_plural_forms(self, ngettext, gettext, + singular, plural, tsingular, tplural, + numbers_only=True): + x = ngettext(singular, plural, 1) + self.assertEqual(x, tsingular) + x = ngettext(singular, plural, 2) + 
self.assertEqual(x, tplural) + x = gettext(singular) + self.assertEqual(x, tsingular) + + if numbers_only: + lineno = self._test_plural_forms.__code__.co_firstlineno + 9 + with self.assertWarns(DeprecationWarning) as cm: + x = ngettext(singular, plural, 1.0) + self.assertEqual(cm.filename, __file__) + self.assertEqual(cm.lineno, lineno + 4) + self.assertEqual(x, tsingular) + with self.assertWarns(DeprecationWarning) as cm: + x = ngettext(singular, plural, 1.1) + self.assertEqual(cm.filename, __file__) + self.assertEqual(cm.lineno, lineno + 9) + self.assertEqual(x, tplural) + with self.assertRaises(TypeError): + ngettext(singular, plural, None) + else: + x = ngettext(singular, plural, None) + self.assertEqual(x, tplural) + + def test_plural_forms(self): + self._test_plural_forms( + self.ngettext, self.gettext, + 'There is %s file', 'There are %s files', + 'Hay %s fichero', 'Hay %s ficheros') + self._test_plural_forms( + self.ngettext, self.gettext, + '%d file deleted', '%d files deleted', + '%d file deleted', '%d files deleted') + + def test_plural_context_forms(self): + ngettext = partial(self.npgettext, 'With context') + gettext = partial(self.pgettext, 'With context') + self._test_plural_forms( + ngettext, gettext, + 'There is %s file', 'There are %s files', + 'Hay %s fichero (context)', 'Hay %s ficheros (context)') + self._test_plural_forms( + ngettext, gettext, + '%d file deleted', '%d files deleted', + '%d file deleted', '%d files deleted') + + def test_plural_wrong_context_forms(self): + self._test_plural_forms( + partial(self.npgettext, 'Unknown context'), + partial(self.pgettext, 'Unknown context'), + 'There is %s file', 'There are %s files', + 'There is %s file', 'There are %s files') + + +class GNUTranslationsPluralFormsTestCase(PluralFormsTests, GettextBaseTest): def setUp(self): GettextBaseTest.setUp(self) - self.localedir = os.curdir # Set up the bindings - gettext.bindtextdomain('gettext', self.localedir) + gettext.bindtextdomain('gettext', os.curdir) gettext.textdomain('gettext') - self.mofile = MOFILE - def test_plural_forms1(self): - eq = self.assertEqual - x = gettext.ngettext('There is %s file', 'There are %s files', 1) - eq(x, 'Hay %s fichero') - x = gettext.ngettext('There is %s file', 'There are %s files', 2) - eq(x, 'Hay %s ficheros') - x = gettext.gettext('There is %s file') - eq(x, 'Hay %s fichero') - - def test_plural_context_forms1(self): - eq = self.assertEqual - x = gettext.npgettext('With context', - 'There is %s file', 'There are %s files', 1) - eq(x, 'Hay %s fichero (context)') - x = gettext.npgettext('With context', - 'There is %s file', 'There are %s files', 2) - eq(x, 'Hay %s ficheros (context)') - x = gettext.pgettext('With context', 'There is %s file') - eq(x, 'Hay %s fichero (context)') - - def test_plural_forms2(self): - eq = self.assertEqual - with open(self.mofile, 'rb') as fp: - t = gettext.GNUTranslations(fp) - x = t.ngettext('There is %s file', 'There are %s files', 1) - eq(x, 'Hay %s fichero') - x = t.ngettext('There is %s file', 'There are %s files', 2) - eq(x, 'Hay %s ficheros') - x = t.gettext('There is %s file') - eq(x, 'Hay %s fichero') - - def test_plural_context_forms2(self): - eq = self.assertEqual - with open(self.mofile, 'rb') as fp: + self.gettext = gettext.gettext + self.ngettext = gettext.ngettext + self.pgettext = gettext.pgettext + self.npgettext = gettext.npgettext + + +class GNUTranslationsWithDomainPluralFormsTestCase(PluralFormsTests, GettextBaseTest): + def setUp(self): + GettextBaseTest.setUp(self) + # Set up the bindings + 
gettext.bindtextdomain('gettext', os.curdir) + + self.gettext = partial(gettext.dgettext, 'gettext') + self.ngettext = partial(gettext.dngettext, 'gettext') + self.pgettext = partial(gettext.dpgettext, 'gettext') + self.npgettext = partial(gettext.dnpgettext, 'gettext') + + def test_plural_forms_wrong_domain(self): + self._test_plural_forms( + partial(gettext.dngettext, 'unknown'), + partial(gettext.dgettext, 'unknown'), + 'There is %s file', 'There are %s files', + 'There is %s file', 'There are %s files', + numbers_only=False) + + def test_plural_context_forms_wrong_domain(self): + self._test_plural_forms( + partial(gettext.dnpgettext, 'unknown', 'With context'), + partial(gettext.dpgettext, 'unknown', 'With context'), + 'There is %s file', 'There are %s files', + 'There is %s file', 'There are %s files', + numbers_only=False) + + +class GNUTranslationsClassPluralFormsTestCase(PluralFormsTests, GettextBaseTest): + def setUp(self): + GettextBaseTest.setUp(self) + with open(MOFILE, 'rb') as fp: t = gettext.GNUTranslations(fp) - x = t.npgettext('With context', - 'There is %s file', 'There are %s files', 1) - eq(x, 'Hay %s fichero (context)') - x = t.npgettext('With context', - 'There is %s file', 'There are %s files', 2) - eq(x, 'Hay %s ficheros (context)') - x = t.pgettext('With context', 'There is %s file') - eq(x, 'Hay %s fichero (context)') + self.gettext = t.gettext + self.ngettext = t.ngettext + self.pgettext = t.pgettext + self.npgettext = t.npgettext + + def test_plural_forms_null_translations(self): + t = gettext.NullTranslations() + self._test_plural_forms( + t.ngettext, t.gettext, + 'There is %s file', 'There are %s files', + 'There is %s file', 'There are %s files', + numbers_only=False) + + def test_plural_context_forms_null_translations(self): + t = gettext.NullTranslations() + self._test_plural_forms( + partial(t.npgettext, 'With context'), + partial(t.pgettext, 'With context'), + 'There is %s file', 'There are %s files', + 'There is %s file', 'There are %s files', + numbers_only=False) + + +class PluralFormsInternalTestCase: # Examples from http://www.gnu.org/software/gettext/manual/gettext.html def test_ja(self): diff --git a/Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst b/Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst new file mode 100644 index 00000000000000..8ff916736584dc --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst @@ -0,0 +1,3 @@ +Deprecation warning about non-integer number in :mod:`gettext` now alwais +refers to the line in the user code where gettext function or method is +used. Previously it could refer to a line in ``gettext`` code. 
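A minimal, illustrative sketch (not part of the patch series) of the behaviour the gettext change above targets. It uses gettext.c2py(), the helper visible in the diff that compiles a C-style plural expression, to show that a non-integer count now produces a DeprecationWarning attributed to the caller's line rather than to a line inside gettext.py:

    import warnings
    from gettext import c2py

    plural = c2py('n != 1')            # compile a C plural expression, as GNUTranslations does
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter('always')
        index = plural(1.0)            # deprecated: the count should be an integer
    print(index)                       # 0 -> the singular form is selected
    print(caught[0].category)          # DeprecationWarning
    print(caught[0].filename)          # points at this file, not at Lib/gettext.py
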
From 9e9df93ffc6df5141843caf651d33d446676a414 Mon Sep 17 00:00:00 2001 From: Bo Anderson Date: Mon, 9 Oct 2023 20:42:25 +0100 Subject: [PATCH 042/265] [3.11] gh-109191: Fix build with newer editline (gh-110239) (#110575) (cherry picked from commit f4cb0d27cc08f490c42a22e646eb73cc7072d54a) --- ...23-10-05-11-46-20.gh-issue-109191.imUkVN.rst | 1 + Modules/readline.c | 2 +- configure | 17 +++++++++++++++++ configure.ac | 14 ++++++++++++++ pyconfig.h.in | 3 +++ 5 files changed, 36 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst diff --git a/Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst b/Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst new file mode 100644 index 00000000000000..27e5df790bc0c6 --- /dev/null +++ b/Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst @@ -0,0 +1 @@ +Fix compile error when building with recent versions of libedit. diff --git a/Modules/readline.c b/Modules/readline.c index 27b89de7279464..8c7f526d418f82 100644 --- a/Modules/readline.c +++ b/Modules/readline.c @@ -440,7 +440,7 @@ readline_set_completion_display_matches_hook_impl(PyObject *module, default completion display. */ rl_completion_display_matches_hook = readlinestate_global->completion_display_matches_hook ? -#if defined(_RL_FUNCTION_TYPEDEF) +#if defined(HAVE_RL_COMPDISP_FUNC_T) (rl_compdisp_func_t *)on_completion_display_matches_hook : 0; #else (VFunction *)on_completion_display_matches_hook : 0; diff --git a/configure b/configure index af4a5bbfdfa1a4..b294f93a5564b4 100755 --- a/configure +++ b/configure @@ -21289,6 +21289,23 @@ if eval test \"x\$"$as_ac_Lib"\" = x"yes"; then : $as_echo "#define HAVE_RL_APPEND_HISTORY 1" >>confdefs.h +fi + + + # in readline as well as newer editline (April 2023) + ac_fn_c_check_type "$LINENO" "rl_compdisp_func_t" "ac_cv_type_rl_compdisp_func_t" " +#include /* Must be first for Gnu Readline */ +#ifdef WITH_EDITLINE +# include +#else +# include +#endif + +" +if test "x$ac_cv_type_rl_compdisp_func_t" = xyes; then : + +$as_echo "#define HAVE_RL_COMPDISP_FUNC_T 1" >>confdefs.h + fi fi diff --git a/configure.ac b/configure.ac index e1cbb7c7fbe9d9..629b7b76c3c315 100644 --- a/configure.ac +++ b/configure.ac @@ -5918,6 +5918,20 @@ if test "$py_cv_lib_readline" = yes; then AC_CHECK_LIB($LIBREADLINE, append_history, AC_DEFINE(HAVE_RL_APPEND_HISTORY, 1, [Define if readline supports append_history]),,$READLINE_LIBS) + + # in readline as well as newer editline (April 2023) + AC_CHECK_TYPE([rl_compdisp_func_t], + [AC_DEFINE([HAVE_RL_COMPDISP_FUNC_T], [1], + [Define if readline supports rl_compdisp_func_t])], + [], + [ +#include /* Must be first for Gnu Readline */ +#ifdef WITH_EDITLINE +# include +#else +# include +#endif + ]) fi # End of readline checks: restore LIBS diff --git a/pyconfig.h.in b/pyconfig.h.in index 0536047f573ce6..94d02e14c444ac 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -968,6 +968,9 @@ /* Define if you can turn off readline's signal handling. */ #undef HAVE_RL_CATCH_SIGNAL +/* Define if readline supports rl_compdisp_func_t */ +#undef HAVE_RL_COMPDISP_FUNC_T + /* Define if you have readline 2.2 */ #undef HAVE_RL_COMPLETION_APPEND_CHARACTER From 5ada51cd511d2e841ca37a1f5ea6b3245e5bf051 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 03:13:34 +0200 Subject: [PATCH 043/265] [3.11] Remove unused `SPHINXLINT` var from `Doc/Makefile`. 
(GH-110570) (#110584) Remove unused `SPHINXLINT` var from `Doc/Makefile`. (GH-110570) Remove unused `SPHINXLINT` var. (cherry picked from commit bdbe43c7d0ad5ebda0232a4ab39689ea79a9733a) Co-authored-by: Ezio Melotti --- Doc/Makefile | 1 - 1 file changed, 1 deletion(-) diff --git a/Doc/Makefile b/Doc/Makefile index 55be2b2dc5c1e6..ca82353eb454f3 100644 --- a/Doc/Makefile +++ b/Doc/Makefile @@ -7,7 +7,6 @@ PYTHON = python3 VENVDIR = ./venv SPHINXBUILD = PATH=$(VENVDIR)/bin:$$PATH sphinx-build -SPHINXLINT = PATH=$(VENVDIR)/bin:$$PATH sphinx-lint BLURB = PATH=$(VENVDIR)/bin:$$PATH blurb JOBS = auto PAPER = From 2943bae7f1faea26770ee9d23a1893cc22e144e7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 10:24:17 +0200 Subject: [PATCH 044/265] [3.11] Add some 'meta hooks' to our pre-commit config (GH-110587) (#110600) Add some 'meta hooks' to our pre-commit config (GH-110587) (cherry picked from commit d5ec77fafd352b4eb290b86d70e4d0b4673459eb) Co-authored-by: Alex Waygood --- .pre-commit-config.yaml | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 3cff5658ba4e81..4c42dbb3c9750b 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -23,3 +23,8 @@ repos: args: [--enable=default-role] files: ^Doc/|^Misc/NEWS.d/next/ types: [rst] + + - repo: meta + hooks: + - id: check-hooks-apply + - id: check-useless-excludes From d099defc63681a9134771f768059066116122599 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 10:36:28 +0200 Subject: [PATCH 045/265] [3.11] gh-109408: Add the docs whitespace check from patchcheck to pre-commit (GH-109854) (#110595) gh-109408: Add the docs whitespace check from patchcheck to pre-commit (GH-109854) (cherry picked from commit 7426ed0347d66f7ef61ea7ae6c3163258b8fb128) Co-authored-by: Hugo van Kemenade Co-authored-by: Alex Waygood Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> --- .pre-commit-config.yaml | 2 +- Tools/scripts/patchcheck.py | 44 ++++++++++--------------------------- 2 files changed, 13 insertions(+), 33 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 4c42dbb3c9750b..141d12b636a618 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -14,7 +14,7 @@ repos: exclude: ^Lib/test/test_tomllib/ - id: check-yaml - id: trailing-whitespace - types_or: [c, python, rst] + types_or: [c, inc, python, rst] - repo: https://github.com/sphinx-contrib/sphinx-lint rev: v0.6.8 diff --git a/Tools/scripts/patchcheck.py b/Tools/scripts/patchcheck.py index c2dceea2a2c634..c53dd45d8601d3 100755 --- a/Tools/scripts/patchcheck.py +++ b/Tools/scripts/patchcheck.py @@ -30,7 +30,8 @@ def get_python_source_dir(): def n_files_str(count): """Return 'N file(s)' with the proper plurality on 'file'.""" - return "{} file{}".format(count, "s" if count != 1 else "") + s = "s" if count != 1 else "" + return f"{count} file{s}" def status(message, modal=False, info=None): @@ -84,7 +85,7 @@ def get_git_remote_default_branch(remote_name): It is typically called 'main', but may differ """ - cmd = "git remote show {}".format(remote_name).split() + cmd = f"git remote show {remote_name}".split() env = os.environ.copy() env['LANG'] = 'C' try: @@ -170,9 +171,9 @@ def report_modified_files(file_paths): if count == 0: return n_files_str(count) else: - lines = ["{}:".format(n_files_str(count))] + lines = 
[f"{n_files_str(count)}:"] for path in file_paths: - lines.append(" {}".format(path)) + lines.append(f" {path}") return "\n".join(lines) @@ -199,27 +200,6 @@ def normalize_c_whitespace(file_paths): return fixed -ws_re = re.compile(br'\s+(\r?\n)$') - -@status("Fixing docs whitespace", info=report_modified_files) -def normalize_docs_whitespace(file_paths): - fixed = [] - for path in file_paths: - abspath = os.path.join(SRCDIR, path) - try: - with open(abspath, 'rb') as f: - lines = f.readlines() - new_lines = [ws_re.sub(br'\1', line) for line in lines] - if new_lines != lines: - shutil.copyfile(abspath, abspath + '.bak') - with open(abspath, 'wb') as f: - f.writelines(new_lines) - fixed.append(path) - except Exception as err: - print('Cannot fix %s: %s' % (path, err)) - return fixed - - @status("Docs modified", modal=True) def docs_modified(file_paths): """Report if any file in the Doc directory has been changed.""" @@ -238,6 +218,7 @@ def reported_news(file_paths): return any(p.startswith(os.path.join('Misc', 'NEWS.d', 'next')) for p in file_paths) + @status("configure regenerated", modal=True, info=str) def regenerated_configure(file_paths): """Check if configure has been regenerated.""" @@ -246,6 +227,7 @@ def regenerated_configure(file_paths): else: return "not needed" + @status("pyconfig.h.in regenerated", modal=True, info=str) def regenerated_pyconfig_h_in(file_paths): """Check if pyconfig.h.in has been regenerated.""" @@ -254,6 +236,7 @@ def regenerated_pyconfig_h_in(file_paths): else: return "not needed" + def ci(pull_request): if pull_request == 'false': print('Not a pull request; skipping') @@ -262,19 +245,18 @@ def ci(pull_request): file_paths = changed_files(base_branch) python_files = [fn for fn in file_paths if fn.endswith('.py')] c_files = [fn for fn in file_paths if fn.endswith(('.c', '.h'))] - doc_files = [fn for fn in file_paths if fn.startswith('Doc') and - fn.endswith(('.rst', '.inc'))] fixed = [] fixed.extend(normalize_whitespace(python_files)) fixed.extend(normalize_c_whitespace(c_files)) - fixed.extend(normalize_docs_whitespace(doc_files)) if not fixed: print('No whitespace issues found') else: - print(f'Please fix the {len(fixed)} file(s) with whitespace issues') - print('(on UNIX you can run `make patchcheck` to make the fixes)') + count = len(fixed) + print(f'Please fix the {n_files_str(count)} with whitespace issues') + print('(on Unix you can run `make patchcheck` to make the fixes)') sys.exit(1) + def main(): base_branch = get_base_branch() file_paths = changed_files(base_branch) @@ -287,8 +269,6 @@ def main(): normalize_whitespace(python_files) # C rules enforcement. normalize_c_whitespace(c_files) - # Doc whitespace enforcement. - normalize_docs_whitespace(doc_files) # Docs updated. docs_modified(doc_files) # Misc/ACKS changed. From 2b3a4182798bc95a426e55d1f9f7199e2f2b5662 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 11:12:52 +0200 Subject: [PATCH 046/265] [3.11] gh-110378: Close invalid generators in contextmanager and asynccontextmanager (GH-110499) (#110589) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit contextmanager and asynccontextmanager context managers now close an invalid underlying generator object that yields more then one value. 
(cherry picked from commit 96fed66a65097eac2dc528ce29c9ba676bb07689) Co-authored-by: Serhiy Storchaka Co-authored-by: Łukasz Langa --- Lib/contextlib.py | 20 ++++++++++++++---- Lib/test/test_contextlib.py | 21 ++++++++++++++++--- ...-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst | 3 +++ 3 files changed, 37 insertions(+), 7 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst diff --git a/Lib/contextlib.py b/Lib/contextlib.py index 58e9a498878d01..4a338f5c637db5 100644 --- a/Lib/contextlib.py +++ b/Lib/contextlib.py @@ -145,7 +145,10 @@ def __exit__(self, typ, value, traceback): except StopIteration: return False else: - raise RuntimeError("generator didn't stop") + try: + raise RuntimeError("generator didn't stop") + finally: + self.gen.close() else: if value is None: # Need to force instantiation so we can reliably @@ -187,7 +190,10 @@ def __exit__(self, typ, value, traceback): raise exc.__traceback__ = traceback return False - raise RuntimeError("generator didn't stop after throw()") + try: + raise RuntimeError("generator didn't stop after throw()") + finally: + self.gen.close() class _AsyncGeneratorContextManager( _GeneratorContextManagerBase, @@ -212,7 +218,10 @@ async def __aexit__(self, typ, value, traceback): except StopAsyncIteration: return False else: - raise RuntimeError("generator didn't stop") + try: + raise RuntimeError("generator didn't stop") + finally: + await self.gen.aclose() else: if value is None: # Need to force instantiation so we can reliably @@ -254,7 +263,10 @@ async def __aexit__(self, typ, value, traceback): raise exc.__traceback__ = traceback return False - raise RuntimeError("generator didn't stop after athrow()") + try: + raise RuntimeError("generator didn't stop after athrow()") + finally: + await self.gen.aclose() def contextmanager(func): diff --git a/Lib/test/test_contextlib.py b/Lib/test/test_contextlib.py index ec06785b5667a6..093f2593a4a464 100644 --- a/Lib/test/test_contextlib.py +++ b/Lib/test/test_contextlib.py @@ -156,9 +156,24 @@ def whoo(): yield ctx = whoo() ctx.__enter__() - self.assertRaises( - RuntimeError, ctx.__exit__, TypeError, TypeError("foo"), None - ) + with self.assertRaises(RuntimeError): + ctx.__exit__(TypeError, TypeError("foo"), None) + if support.check_impl_detail(cpython=True): + # The "gen" attribute is an implementation detail. + self.assertFalse(ctx.gen.gi_suspended) + + def test_contextmanager_trap_second_yield(self): + @contextmanager + def whoo(): + yield + yield + ctx = whoo() + ctx.__enter__() + with self.assertRaises(RuntimeError): + ctx.__exit__(None, None, None) + if support.check_impl_detail(cpython=True): + # The "gen" attribute is an implementation detail. + self.assertFalse(ctx.gen.gi_suspended) def test_contextmanager_except(self): state = [] diff --git a/Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst b/Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst new file mode 100644 index 00000000000000..ef5395fc3c6420 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst @@ -0,0 +1,3 @@ +:func:`~contextlib.contextmanager` and +:func:`~contextlib.asynccontextmanager` context managers now close an invalid +underlying generator object that yields more then one value. 
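A small illustrative sketch (not taken from the patches) of what the contextlib change above fixes: a @contextmanager generator that yields more than once still raises RuntimeError on exit, but the offending generator is now also closed. The "gen" attribute is a CPython implementation detail, as the tests above note:

    from contextlib import contextmanager

    @contextmanager
    def broken():
        yield 1
        yield 2                        # a @contextmanager generator must yield exactly once

    cm = broken()
    cm.__enter__()
    try:
        cm.__exit__(None, None, None)
    except RuntimeError as exc:
        print(exc)                     # "generator didn't stop"
    print(cm.gen.gi_frame is None)     # True: the generator was closed, so it cannot leak
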
From 194272179dc2cbe67358fc0ebde73dea90bc5a1b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 12:06:18 +0200 Subject: [PATCH 047/265] [3.11] gh-78469: Declare missing sethostname for Solaris 10 (GH-109447) (#110581) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Add OS version specific macro for Solaris: Py_SUNOS_VERSION. (cherry picked from commit 3b1580af07c0ce90d1c2073ab087772283d7e3b9) Co-authored-by: Jakub Kulík --- Modules/socketmodule.c | 5 +++-- configure | 11 +++++++++++ configure.ac | 8 ++++++++ pyconfig.h.in | 3 +++ 4 files changed, 25 insertions(+), 2 deletions(-) diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c index db3e0519c63f3d..8d9d3c3e1cd3e3 100644 --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -5534,8 +5534,9 @@ socket_sethostname(PyObject *self, PyObject *args) Py_buffer buf; int res, flag = 0; -#ifdef _AIX -/* issue #18259, not declared in any useful header file */ +#if defined(_AIX) || (defined(__sun) && defined(__SVR4) && Py_SUNOS_VERSION <= 510) +/* issue #18259, sethostname is not declared in any useful header file on AIX + * the same is true for Solaris 10 */ extern int sethostname(const char *, size_t); #endif diff --git a/configure b/configure index b294f93a5564b4..8d2f3f4cc0fa5f 100755 --- a/configure +++ b/configure @@ -3860,6 +3860,17 @@ then darwin*) MACHDEP="darwin";; '') MACHDEP="unknown";; esac + + if test "$ac_sys_system" = "SunOS"; then + # For Solaris, there isn't an OS version specific macro defined + # in most compilers, so we define one here. + SUNOS_VERSION=`echo $ac_sys_release | sed -e 's!\.\(0-9\)$!.0\1!g' | tr -d '.'` + +cat >>confdefs.h <<_ACEOF +#define Py_SUNOS_VERSION $SUNOS_VERSION +_ACEOF + + fi fi { $as_echo "$as_me:${as_lineno-$LINENO}: result: \"$MACHDEP\"" >&5 $as_echo "\"$MACHDEP\"" >&6; } diff --git a/configure.ac b/configure.ac index 629b7b76c3c315..52d5c1f7dd3593 100644 --- a/configure.ac +++ b/configure.ac @@ -582,6 +582,14 @@ then darwin*) MACHDEP="darwin";; '') MACHDEP="unknown";; esac + + if test "$ac_sys_system" = "SunOS"; then + # For Solaris, there isn't an OS version specific macro defined + # in most compilers, so we define one here. + SUNOS_VERSION=`echo $ac_sys_release | sed -e 's!\.\([0-9]\)$!.0\1!g' | tr -d '.'` + AC_DEFINE_UNQUOTED([Py_SUNOS_VERSION], [$SUNOS_VERSION], + [The version of SunOS/Solaris as reported by `uname -r' without the dot.]) + fi fi AC_MSG_RESULT("$MACHDEP") diff --git a/pyconfig.h.in b/pyconfig.h.in index 94d02e14c444ac..a8c35bba448ea8 100644 --- a/pyconfig.h.in +++ b/pyconfig.h.in @@ -1595,6 +1595,9 @@ /* Define if you want to enable internal statistics gathering. */ #undef Py_STATS +/* The version of SunOS/Solaris as reported by `uname -r' without the dot. */ +#undef Py_SUNOS_VERSION + /* Define if you want to enable tracing references for debugging purpose */ #undef Py_TRACE_REFS From 7fefed091ad2be083d353446c0631d2739cb0ead Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 12:48:07 +0200 Subject: [PATCH 048/265] [3.11] gh-110590: Fix a bug where _sre.compile would overwrite exceptions (GH-110591) (#110614) TypeError would be overwritten by OverflowError if 'code' param contained non-ints. 
(cherry picked from commit 344d3a222a7864f8157773749bdd77d1c9dfc1e6) Co-authored-by: Nikita Sobolev --- Lib/test/test_re.py | 3 +++ .../Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst | 3 +++ Modules/_sre/sre.c | 3 +++ 3 files changed, 9 insertions(+) create mode 100644 Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst diff --git a/Lib/test/test_re.py b/Lib/test/test_re.py index f4d64dc9fcf2ae..b41bcd49385453 100644 --- a/Lib/test/test_re.py +++ b/Lib/test/test_re.py @@ -2725,6 +2725,9 @@ def test_dealloc(self): _sre.compile("abc", 0, [long_overflow], 0, {}, ()) with self.assertRaises(TypeError): _sre.compile({}, 0, [], 0, [], []) + # gh-110590: `TypeError` was overwritten with `OverflowError`: + with self.assertRaises(TypeError): + _sre.compile('', 0, ['abc'], 0, {}, ()) @cpython_only def test_repeat_minmax_overflow_maxrepeat(self): diff --git a/Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst b/Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst new file mode 100644 index 00000000000000..20dc3fff205994 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst @@ -0,0 +1,3 @@ +Fix a bug in :meth:`!_sre.compile` where :exc:`TypeError` +would be overwritten by :exc:`OverflowError` when +the *code* argument was a list of non-ints. diff --git a/Modules/_sre/sre.c b/Modules/_sre/sre.c index 448e761c988ca8..81629f0007e6a1 100644 --- a/Modules/_sre/sre.c +++ b/Modules/_sre/sre.c @@ -1437,6 +1437,9 @@ _sre_compile_impl(PyObject *module, PyObject *pattern, int flags, for (i = 0; i < n; i++) { PyObject *o = PyList_GET_ITEM(code, i); unsigned long value = PyLong_AsUnsignedLong(o); + if (value == (unsigned long)-1 && PyErr_Occurred()) { + break; + } self->code[i] = (SRE_CODE) value; if ((unsigned long) self->code[i] != value) { PyErr_SetString(PyExc_OverflowError, From bf1753bb4c75f2dc8f82ee3409dce024ccc1c4de Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 13:12:40 +0200 Subject: [PATCH 049/265] [3.11] gh-110378: Fix test_async_gen_propagates_generator_exit in test_contextlib_async (GH-110500) (#110611) It now fails if the original bug is not fixed, and no longer produce ResourceWarning with fixed code. 
(cherry picked from commit 5aa62a8de15212577a13966710b3aede46e93824) Co-authored-by: Serhiy Storchaka --- Lib/test/test_contextlib_async.py | 14 +++++--------- 1 file changed, 5 insertions(+), 9 deletions(-) diff --git a/Lib/test/test_contextlib_async.py b/Lib/test/test_contextlib_async.py index 3d43ed0fcab168..438bb2dcfcc5eb 100644 --- a/Lib/test/test_contextlib_async.py +++ b/Lib/test/test_contextlib_async.py @@ -49,15 +49,11 @@ async def gen(): async with ctx(): yield 11 - ret = [] - exc = ValueError(22) - with self.assertRaises(ValueError): - async with ctx(): - async for val in gen(): - ret.append(val) - raise exc - - self.assertEqual(ret, [11]) + g = gen() + async for val in g: + self.assertEqual(val, 11) + break + await g.aclose() def test_exit_is_abstract(self): class MissingAexit(AbstractAsyncContextManager): From de62c2c1b3c2b75c36425c32b5536a17a3595a9b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 14:10:33 +0200 Subject: [PATCH 050/265] [3.11] gh-101100: Fix sphinx warnings in `library/socketserver.rst` (GH-110207) (GH-110624) (cherry picked from commit 756062b296df6242ba324e4cdc8f3e38bfc83617) Co-authored-by: Nikita Sobolev --- Doc/library/socketserver.rst | 77 ++++++++++++++++++++++++------------ Doc/tools/.nitignore | 1 - 2 files changed, 52 insertions(+), 26 deletions(-) diff --git a/Doc/library/socketserver.rst b/Doc/library/socketserver.rst index a409c99b1f6a45..7540375b8d1f50 100644 --- a/Doc/library/socketserver.rst +++ b/Doc/library/socketserver.rst @@ -116,23 +116,28 @@ server is the address family. :class:`ForkingMixIn` and the Forking classes mentioned below are only available on POSIX platforms that support :func:`~os.fork`. - :meth:`socketserver.ForkingMixIn.server_close` waits until all child - processes complete, except if - :attr:`socketserver.ForkingMixIn.block_on_close` attribute is false. + .. attribute:: block_on_close - :meth:`socketserver.ThreadingMixIn.server_close` waits until all non-daemon - threads complete, except if - :attr:`socketserver.ThreadingMixIn.block_on_close` attribute is false. Use - daemonic threads by setting - :data:`ThreadingMixIn.daemon_threads` to ``True`` to not wait until threads - complete. + :meth:`ForkingMixIn.server_close ` + waits until all child processes complete, except if + :attr:`block_on_close` attribute is ``False``. + + :meth:`ThreadingMixIn.server_close ` + waits until all non-daemon threads complete, except if + :attr:`block_on_close` attribute is ``False``. + + .. attribute:: daemon_threads + + For :class:`ThreadingMixIn` use daemonic threads by setting + :data:`ThreadingMixIn.daemon_threads ` + to ``True`` to not wait until threads complete. .. versionchanged:: 3.7 - :meth:`socketserver.ForkingMixIn.server_close` and - :meth:`socketserver.ThreadingMixIn.server_close` now waits until all + :meth:`ForkingMixIn.server_close ` and + :meth:`ThreadingMixIn.server_close ` now waits until all child processes and non-daemonic threads complete. - Add a new :attr:`socketserver.ForkingMixIn.block_on_close` class + Add a new :attr:`ForkingMixIn.block_on_close ` class attribute to opt-in for the pre-3.7 behaviour. @@ -406,13 +411,13 @@ Request Handler Objects This function must do all the work required to service a request. The default implementation does nothing. 
Several instance attributes are - available to it; the request is available as :attr:`self.request`; the client - address as :attr:`self.client_address`; and the server instance as - :attr:`self.server`, in case it needs access to per-server information. + available to it; the request is available as :attr:`request`; the client + address as :attr:`client_address`; and the server instance as + :attr:`server`, in case it needs access to per-server information. - The type of :attr:`self.request` is different for datagram or stream - services. For stream services, :attr:`self.request` is a socket object; for - datagram services, :attr:`self.request` is a pair of string and socket. + The type of :attr:`request` is different for datagram or stream + services. For stream services, :attr:`request` is a socket object; for + datagram services, :attr:`request` is a pair of string and socket. .. method:: finish() @@ -422,20 +427,42 @@ Request Handler Objects raises an exception, this function will not be called. + .. attribute:: request + + The *new* :class:`socket.socket` object + to be used to communicate with the client. + + + .. attribute:: client_address + + Client address returned by :meth:`BaseServer.get_request`. + + + .. attribute:: server + + :class:`BaseServer` object used for handling the request. + + .. class:: StreamRequestHandler DatagramRequestHandler These :class:`BaseRequestHandler` subclasses override the :meth:`~BaseRequestHandler.setup` and :meth:`~BaseRequestHandler.finish` - methods, and provide :attr:`self.rfile` and :attr:`self.wfile` attributes. - The :attr:`self.rfile` and :attr:`self.wfile` attributes can be - read or written, respectively, to get the request data or return data - to the client. - The :attr:`!rfile` attributes support the :class:`io.BufferedIOBase` readable interface, - and :attr:`!wfile` attributes support the :class:`!io.BufferedIOBase` writable interface. + methods, and provide :attr:`rfile` and :attr:`wfile` attributes. + + .. attribute:: rfile + + A file object from which receives the request is read. + Support the :class:`io.BufferedIOBase` readable interface. + + .. attribute:: wfile + + A file object to which the reply is written. + Support the :class:`io.BufferedIOBase` writable interface + .. versionchanged:: 3.6 - :attr:`StreamRequestHandler.wfile` also supports the + :attr:`wfile` also supports the :class:`io.BufferedIOBase` writable interface. 
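The handler attributes documented above are easiest to see in a tiny echo server. The sketch below is illustrative only (the address and behaviour are invented, not part of the patch); it shows a StreamRequestHandler reading from rfile and answering through wfile:

    import socketserver

    class EchoHandler(socketserver.StreamRequestHandler):
        def handle(self):
            # self.rfile / self.wfile wrap self.request (the client socket);
            # self.client_address is the peer's (host, port) pair.
            line = self.rfile.readline()
            self.wfile.write(b"echo: " + line)

    if __name__ == "__main__":
        # Port 0 asks the OS for a free port; purely illustrative.
        with socketserver.TCPServer(("127.0.0.1", 0), EchoHandler) as server:
            print("serving on", server.server_address)
            server.serve_forever()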
diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 8e1c28a4c7b424..34975d5312f6e7 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -112,7 +112,6 @@ Doc/library/shelve.rst Doc/library/signal.rst Doc/library/smtplib.rst Doc/library/socket.rst -Doc/library/socketserver.rst Doc/library/ssl.rst Doc/library/stdtypes.rst Doc/library/string.rst From e511f544ea444a5653f322d01507563d9cf66268 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 14:11:06 +0200 Subject: [PATCH 051/265] [3.11] gh-81002: Add tests for termios (GH-110386) (GH-110620) (cherry picked from commit 92a9e980245156bf75ede0869f8ba9512e04d2eb) Co-authored-by: Serhiy Storchaka --- Lib/test/test_termios.py | 220 ++++++++++++++++++ ...3-10-05-13-46-50.gh-issue-81002.bOcuV6.rst | 1 + 2 files changed, 221 insertions(+) create mode 100644 Lib/test/test_termios.py create mode 100644 Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst diff --git a/Lib/test/test_termios.py b/Lib/test/test_termios.py new file mode 100644 index 00000000000000..58698ffac2d981 --- /dev/null +++ b/Lib/test/test_termios.py @@ -0,0 +1,220 @@ +import errno +import os +import sys +import tempfile +import unittest +from test.support.import_helper import import_module + +termios = import_module('termios') + + +@unittest.skipUnless(hasattr(os, 'openpty'), "need os.openpty()") +class TestFunctions(unittest.TestCase): + + def setUp(self): + master_fd, self.fd = os.openpty() + self.addCleanup(os.close, master_fd) + self.stream = self.enterContext(open(self.fd, 'wb', buffering=0)) + tmp = self.enterContext(tempfile.TemporaryFile(mode='wb', buffering=0)) + self.bad_fd = tmp.fileno() + + def assertRaisesTermiosError(self, errno, callable, *args): + with self.assertRaises(termios.error) as cm: + callable(*args) + self.assertEqual(cm.exception.args[0], errno) + + def test_tcgetattr(self): + attrs = termios.tcgetattr(self.fd) + self.assertIsInstance(attrs, list) + self.assertEqual(len(attrs), 7) + for i in range(6): + self.assertIsInstance(attrs[i], int) + iflag, oflag, cflag, lflag, ispeed, ospeed, cc = attrs + self.assertIsInstance(cc, list) + self.assertEqual(len(cc), termios.NCCS) + for i, x in enumerate(cc): + if ((lflag & termios.ICANON) == 0 and + (i == termios.VMIN or i == termios.VTIME)): + self.assertIsInstance(x, int) + else: + self.assertIsInstance(x, bytes) + self.assertEqual(len(x), 1) + self.assertEqual(termios.tcgetattr(self.stream), attrs) + + def test_tcgetattr_errors(self): + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcgetattr, self.bad_fd) + self.assertRaises(ValueError, termios.tcgetattr, -1) + self.assertRaises(OverflowError, termios.tcgetattr, 2**1000) + self.assertRaises(TypeError, termios.tcgetattr, object()) + self.assertRaises(TypeError, termios.tcgetattr) + + def test_tcsetattr(self): + attrs = termios.tcgetattr(self.fd) + termios.tcsetattr(self.fd, termios.TCSANOW, attrs) + termios.tcsetattr(self.fd, termios.TCSADRAIN, attrs) + termios.tcsetattr(self.fd, termios.TCSAFLUSH, attrs) + termios.tcsetattr(self.stream, termios.TCSANOW, attrs) + + def test_tcsetattr_errors(self): + attrs = termios.tcgetattr(self.fd) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, tuple(attrs)) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs[:-1]) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs + [0]) + for i in range(6): + attrs2 = attrs[:] + attrs2[i] 
= 2**1000 + self.assertRaises(OverflowError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs2) + attrs2[i] = object() + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs2) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs[:-1] + [attrs[-1][:-1]]) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs[:-1] + [attrs[-1] + [b'\0']]) + for i in range(len(attrs[-1])): + attrs2 = attrs[:] + attrs2[-1] = attrs2[-1][:] + attrs2[-1][i] = 2**1000 + self.assertRaises(OverflowError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs2) + attrs2[-1][i] = object() + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs2) + attrs2[-1][i] = b'' + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs2) + attrs2[-1][i] = b'\0\0' + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, attrs2) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW, object()) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW) + self.assertRaisesTermiosError(errno.EINVAL, termios.tcsetattr, self.fd, -1, attrs) + self.assertRaises(OverflowError, termios.tcsetattr, self.fd, 2**1000, attrs) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, object(), attrs) + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcsetattr, self.bad_fd, termios.TCSANOW, attrs) + self.assertRaises(ValueError, termios.tcsetattr, -1, termios.TCSANOW, attrs) + self.assertRaises(OverflowError, termios.tcsetattr, 2**1000, termios.TCSANOW, attrs) + self.assertRaises(TypeError, termios.tcsetattr, object(), termios.TCSANOW, attrs) + self.assertRaises(TypeError, termios.tcsetattr, self.fd, termios.TCSANOW) + + def test_tcsendbreak(self): + try: + termios.tcsendbreak(self.fd, 1) + except termios.error as exc: + if exc.args[0] == errno.ENOTTY and sys.platform.startswith('freebsd'): + self.skipTest('termios.tcsendbreak() is not supported ' + 'with pseudo-terminals (?) 
on this platform') + raise + termios.tcsendbreak(self.stream, 1) + + def test_tcsendbreak_errors(self): + self.assertRaises(OverflowError, termios.tcsendbreak, self.fd, 2**1000) + self.assertRaises(TypeError, termios.tcsendbreak, self.fd, 0.0) + self.assertRaises(TypeError, termios.tcsendbreak, self.fd, object()) + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcsendbreak, self.bad_fd, 0) + self.assertRaises(ValueError, termios.tcsendbreak, -1, 0) + self.assertRaises(OverflowError, termios.tcsendbreak, 2**1000, 0) + self.assertRaises(TypeError, termios.tcsendbreak, object(), 0) + self.assertRaises(TypeError, termios.tcsendbreak, self.fd) + + def test_tcdrain(self): + termios.tcdrain(self.fd) + termios.tcdrain(self.stream) + + def test_tcdrain_errors(self): + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcdrain, self.bad_fd) + self.assertRaises(ValueError, termios.tcdrain, -1) + self.assertRaises(OverflowError, termios.tcdrain, 2**1000) + self.assertRaises(TypeError, termios.tcdrain, object()) + self.assertRaises(TypeError, termios.tcdrain) + + def test_tcflush(self): + termios.tcflush(self.fd, termios.TCIFLUSH) + termios.tcflush(self.fd, termios.TCOFLUSH) + termios.tcflush(self.fd, termios.TCIOFLUSH) + + def test_tcflush_errors(self): + self.assertRaisesTermiosError(errno.EINVAL, termios.tcflush, self.fd, -1) + self.assertRaises(OverflowError, termios.tcflush, self.fd, 2**1000) + self.assertRaises(TypeError, termios.tcflush, self.fd, object()) + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcflush, self.bad_fd, termios.TCIFLUSH) + self.assertRaises(ValueError, termios.tcflush, -1, termios.TCIFLUSH) + self.assertRaises(OverflowError, termios.tcflush, 2**1000, termios.TCIFLUSH) + self.assertRaises(TypeError, termios.tcflush, object(), termios.TCIFLUSH) + self.assertRaises(TypeError, termios.tcflush, self.fd) + + def test_tcflow(self): + termios.tcflow(self.fd, termios.TCOOFF) + termios.tcflow(self.fd, termios.TCOON) + termios.tcflow(self.fd, termios.TCIOFF) + termios.tcflow(self.fd, termios.TCION) + + def test_tcflow_errors(self): + self.assertRaisesTermiosError(errno.EINVAL, termios.tcflow, self.fd, -1) + self.assertRaises(OverflowError, termios.tcflow, self.fd, 2**1000) + self.assertRaises(TypeError, termios.tcflow, self.fd, object()) + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcflow, self.bad_fd, termios.TCOON) + self.assertRaises(ValueError, termios.tcflow, -1, termios.TCOON) + self.assertRaises(OverflowError, termios.tcflow, 2**1000, termios.TCOON) + self.assertRaises(TypeError, termios.tcflow, object(), termios.TCOON) + self.assertRaises(TypeError, termios.tcflow, self.fd) + + def test_tcgetwinsize(self): + size = termios.tcgetwinsize(self.fd) + self.assertIsInstance(size, tuple) + self.assertEqual(len(size), 2) + self.assertIsInstance(size[0], int) + self.assertIsInstance(size[1], int) + self.assertEqual(termios.tcgetwinsize(self.stream), size) + + def test_tcgetwinsize_errors(self): + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcgetwinsize, self.bad_fd) + self.assertRaises(ValueError, termios.tcgetwinsize, -1) + self.assertRaises(OverflowError, termios.tcgetwinsize, 2**1000) + self.assertRaises(TypeError, termios.tcgetwinsize, object()) + self.assertRaises(TypeError, termios.tcgetwinsize) + + def test_tcsetwinsize(self): + size = termios.tcgetwinsize(self.fd) + termios.tcsetwinsize(self.fd, size) + termios.tcsetwinsize(self.fd, list(size)) + termios.tcsetwinsize(self.stream, size) + + def test_tcsetwinsize_errors(self): + size = 
termios.tcgetwinsize(self.fd) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, size[:-1]) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, size + (0,)) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, object()) + self.assertRaises(OverflowError, termios.tcsetwinsize, self.fd, (size[0], 2**1000)) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, (size[0], float(size[1]))) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, (size[0], object())) + self.assertRaises(OverflowError, termios.tcsetwinsize, self.fd, (2**1000, size[1])) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, (float(size[0]), size[1])) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd, (object(), size[1])) + self.assertRaisesTermiosError(errno.ENOTTY, termios.tcsetwinsize, self.bad_fd, size) + self.assertRaises(ValueError, termios.tcsetwinsize, -1, size) + self.assertRaises(OverflowError, termios.tcsetwinsize, 2**1000, size) + self.assertRaises(TypeError, termios.tcsetwinsize, object(), size) + self.assertRaises(TypeError, termios.tcsetwinsize, self.fd) + + +class TestModule(unittest.TestCase): + def test_constants(self): + self.assertIsInstance(termios.B0, int) + self.assertIsInstance(termios.B38400, int) + self.assertIsInstance(termios.TCSANOW, int) + self.assertIsInstance(termios.TCSADRAIN, int) + self.assertIsInstance(termios.TCSAFLUSH, int) + self.assertIsInstance(termios.TCIFLUSH, int) + self.assertIsInstance(termios.TCOFLUSH, int) + self.assertIsInstance(termios.TCIOFLUSH, int) + self.assertIsInstance(termios.TCOOFF, int) + self.assertIsInstance(termios.TCOON, int) + self.assertIsInstance(termios.TCIOFF, int) + self.assertIsInstance(termios.TCION, int) + self.assertIsInstance(termios.VTIME, int) + self.assertIsInstance(termios.VMIN, int) + self.assertIsInstance(termios.NCCS, int) + self.assertLess(termios.VTIME, termios.NCCS) + self.assertLess(termios.VMIN, termios.NCCS) + + def test_exception(self): + self.assertTrue(issubclass(termios.error, Exception)) + self.assertFalse(issubclass(termios.error, OSError)) + + +if __name__ == '__main__': + unittest.main() diff --git a/Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst b/Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst new file mode 100644 index 00000000000000..d69f6746d9e9aa --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst @@ -0,0 +1 @@ +Add tests for :mod:`termios`. 
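For context on the API these new tests exercise, a typical tcgetattr()/tcsetattr() round trip looks roughly like the sketch below (POSIX-only, and it assumes standard input is a terminal; illustrative, not taken from the test suite):

    import sys
    import termios

    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)      # [iflag, oflag, cflag, lflag, ispeed, ospeed, cc]
    new = old[:]
    new[3] &= ~termios.ECHO          # lflag: switch terminal echo off
    try:
        termios.tcsetattr(fd, termios.TCSADRAIN, new)
        secret = input("password: ")
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)   # always restore the saved mode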
From f72b18d146739e4a358499e621ef41f50fa2680a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 15:00:24 +0200 Subject: [PATCH 052/265] [3.11] Don't doubly-parallelise sphinx-lint (GH-110617) (#110627) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> --- .pre-commit-config.yaml | 1 + 1 file changed, 1 insertion(+) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 141d12b636a618..1ae3dd71354e63 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -23,6 +23,7 @@ repos: args: [--enable=default-role] files: ^Doc/|^Misc/NEWS.d/next/ types: [rst] + require_serial: true - repo: meta hooks: From 3788c48d1227a9a6176e461c1bf6975c60afe5cf Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Tue, 10 Oct 2023 17:27:22 +0300 Subject: [PATCH 053/265] [3.11] gh-110388: Add tests for tty (GH-110394) (GH-110634) (cherry picked from commit 7f702b2) --- Lib/test/test_tty.py | 60 +++++++++++++++++++ ...-10-05-14-22-48.gh-issue-110388.1-HQJO.rst | 1 + 2 files changed, 61 insertions(+) create mode 100644 Lib/test/test_tty.py create mode 100644 Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst diff --git a/Lib/test/test_tty.py b/Lib/test/test_tty.py new file mode 100644 index 00000000000000..ec1f3407cddf84 --- /dev/null +++ b/Lib/test/test_tty.py @@ -0,0 +1,60 @@ +import os +import unittest +from test.support.import_helper import import_module + +termios = import_module('termios') +tty = import_module('tty') + + +@unittest.skipUnless(hasattr(os, 'openpty'), "need os.openpty()") +class TestTty(unittest.TestCase): + + def setUp(self): + master_fd, self.fd = os.openpty() + self.addCleanup(os.close, master_fd) + self.stream = self.enterContext(open(self.fd, 'wb', buffering=0)) + self.fd = self.stream.fileno() + self.mode = termios.tcgetattr(self.fd) + self.addCleanup(termios.tcsetattr, self.fd, termios.TCSANOW, self.mode) + self.addCleanup(termios.tcsetattr, self.fd, termios.TCSAFLUSH, self.mode) + + def check_cbreak(self, mode): + self.assertEqual(mode[3] & termios.ECHO, 0) + self.assertEqual(mode[3] & termios.ICANON, 0) + self.assertEqual(mode[6][termios.VMIN], 1) + self.assertEqual(mode[6][termios.VTIME], 0) + + def check_raw(self, mode): + self.assertEqual(mode[0] & termios.ICRNL, 0) + self.check_cbreak(mode) + self.assertEqual(mode[0] & termios.ISTRIP, 0) + self.assertEqual(mode[0] & termios.ICRNL, 0) + self.assertEqual(mode[1] & termios.OPOST, 0) + self.assertEqual(mode[2] & termios.PARENB, termios.CS8 & termios.PARENB) + self.assertEqual(mode[2] & termios.CSIZE, termios.CS8 & termios.CSIZE) + self.assertEqual(mode[2] & termios.CS8, termios.CS8) + self.assertEqual(mode[3] & termios.ECHO, 0) + self.assertEqual(mode[3] & termios.ICANON, 0) + self.assertEqual(mode[3] & termios.ISIG, 0) + self.assertEqual(mode[6][termios.VMIN], 1) + self.assertEqual(mode[6][termios.VTIME], 0) + + def test_setraw(self): + tty.setraw(self.fd) + mode = termios.tcgetattr(self.fd) + self.check_raw(mode) + tty.setraw(self.fd, termios.TCSANOW) + tty.setraw(self.stream) + tty.setraw(fd=self.fd, when=termios.TCSANOW) + + def test_setcbreak(self): + tty.setcbreak(self.fd) + mode = termios.tcgetattr(self.fd) + self.check_cbreak(mode) + tty.setcbreak(self.fd, termios.TCSANOW) + tty.setcbreak(self.stream) + tty.setcbreak(fd=self.fd, when=termios.TCSANOW) + + +if __name__ == '__main__': + unittest.main() diff --git a/Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst 
b/Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst new file mode 100644 index 00000000000000..caac41f81547de --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst @@ -0,0 +1 @@ +Add tests for :mod:`tty`. From 4b67878daa9fee58db80609f851ada9402fcd71c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 17:03:11 +0200 Subject: [PATCH 054/265] [3.11] gh-110631: Set three-space indents for reST in EditorConfig (GH-110635) (#110638) gh-110631: Set three-space indents for reST in EditorConfig (GH-110635) Set three-space indents in EditorConfig (cherry picked from commit 66a9b1082049855889854bfde617059499c26dd2) Co-authored-by: Hugo van Kemenade --- .editorconfig | 3 +++ 1 file changed, 3 insertions(+) diff --git a/.editorconfig b/.editorconfig index 81445d2d79c739..0169eed951cd3f 100644 --- a/.editorconfig +++ b/.editorconfig @@ -8,5 +8,8 @@ indent_style = space [*.{py,c,cpp,h}] indent_size = 4 +[*.rst] +indent_size = 3 + [*.yml] indent_size = 2 From 28c6cc1928bf517076a44b4112f7f56dae7f076b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 10 Oct 2023 23:02:21 +0200 Subject: [PATCH 055/265] [3.11] [3.12] gh-108303: Move all certificates to `Lib/test/certdata/` (GH-109489) (GH-109682) (#110646) [3.12] gh-108303: Move all certificates to `Lib/test/certdata/` (GH-109489) (GH-109682) * gh-108303: Move all certificates to `Lib/test/certdata/` (GH-109489) (cherry picked from commit e57ecf6bbc59f999d27b125ea51b042c24a07bd9) Python 3.12 backport: update also `test_nntplib`. (cherry picked from commit c2d542b42cd109d81c0308f9c4437c38ac74d2e0) Co-authored-by: Nikita Sobolev Co-authored-by: T. 
Wouters --- Lib/test/{ => certdata}/allsans.pem | 0 Lib/test/{ => certdata}/badcert.pem | 0 Lib/test/{ => certdata}/badkey.pem | 0 Lib/test/{ => certdata}/capath/4e1295a3.0 | 0 Lib/test/{ => certdata}/capath/5ed36f99.0 | 0 Lib/test/{ => certdata}/capath/6e88d7b8.0 | 0 Lib/test/{ => certdata}/capath/99d0fa06.0 | 0 Lib/test/{ => certdata}/capath/b1930218.0 | 0 Lib/test/{ => certdata}/capath/ceff1710.0 | 0 Lib/test/{ => certdata}/ffdh3072.pem | 0 Lib/test/{ => certdata}/idnsans.pem | 0 Lib/test/{ => certdata}/keycert.passwd.pem | 0 Lib/test/{ => certdata}/keycert.pem | 0 Lib/test/{ => certdata}/keycert2.pem | 0 Lib/test/{ => certdata}/keycert3.pem | 0 Lib/test/{ => certdata}/keycert4.pem | 0 Lib/test/{ => certdata}/keycertecc.pem | 0 Lib/test/{ => certdata}/make_ssl_certs.py | 0 Lib/test/{ => certdata}/nokia.pem | 0 Lib/test/{ => certdata}/nosan.pem | 0 Lib/test/{ => certdata}/nullbytecert.pem | 0 Lib/test/{ => certdata}/nullcert.pem | 0 Lib/test/{ => certdata}/pycacert.pem | 0 Lib/test/{ => certdata}/pycakey.pem | 0 Lib/test/{ => certdata}/revocation.crl | 0 Lib/test/{ => certdata}/secp384r1.pem | 0 .../selfsigned_pythontestdotnet.pem | 0 Lib/test/{ => certdata}/ssl_cert.pem | 0 Lib/test/{ => certdata}/ssl_key.passwd.pem | 0 Lib/test/{ => certdata}/ssl_key.pem | 0 Lib/test/{ => certdata}/talos-2019-0758.pem | 0 Lib/test/ssl_servers.py | 2 +- Lib/test/test_asyncio/utils.py | 16 ++++++++-------- Lib/test/test_ftplib.py | 4 ++-- Lib/test/test_httplib.py | 8 +++++--- Lib/test/test_imaplib.py | 4 ++-- Lib/test/test_logging.py | 2 +- Lib/test/test_nntplib.py | 2 +- Lib/test/test_poplib.py | 4 ++-- Lib/test/test_ssl.py | 12 ++++++------ Lib/test/test_urllib2_localnet.py | 4 ++-- Makefile.pre.in | 3 ++- 42 files changed, 32 insertions(+), 29 deletions(-) rename Lib/test/{ => certdata}/allsans.pem (100%) rename Lib/test/{ => certdata}/badcert.pem (100%) rename Lib/test/{ => certdata}/badkey.pem (100%) rename Lib/test/{ => certdata}/capath/4e1295a3.0 (100%) rename Lib/test/{ => certdata}/capath/5ed36f99.0 (100%) rename Lib/test/{ => certdata}/capath/6e88d7b8.0 (100%) rename Lib/test/{ => certdata}/capath/99d0fa06.0 (100%) rename Lib/test/{ => certdata}/capath/b1930218.0 (100%) rename Lib/test/{ => certdata}/capath/ceff1710.0 (100%) rename Lib/test/{ => certdata}/ffdh3072.pem (100%) rename Lib/test/{ => certdata}/idnsans.pem (100%) rename Lib/test/{ => certdata}/keycert.passwd.pem (100%) rename Lib/test/{ => certdata}/keycert.pem (100%) rename Lib/test/{ => certdata}/keycert2.pem (100%) rename Lib/test/{ => certdata}/keycert3.pem (100%) rename Lib/test/{ => certdata}/keycert4.pem (100%) rename Lib/test/{ => certdata}/keycertecc.pem (100%) rename Lib/test/{ => certdata}/make_ssl_certs.py (100%) rename Lib/test/{ => certdata}/nokia.pem (100%) rename Lib/test/{ => certdata}/nosan.pem (100%) rename Lib/test/{ => certdata}/nullbytecert.pem (100%) rename Lib/test/{ => certdata}/nullcert.pem (100%) rename Lib/test/{ => certdata}/pycacert.pem (100%) rename Lib/test/{ => certdata}/pycakey.pem (100%) rename Lib/test/{ => certdata}/revocation.crl (100%) rename Lib/test/{ => certdata}/secp384r1.pem (100%) rename Lib/test/{ => certdata}/selfsigned_pythontestdotnet.pem (100%) rename Lib/test/{ => certdata}/ssl_cert.pem (100%) rename Lib/test/{ => certdata}/ssl_key.passwd.pem (100%) rename Lib/test/{ => certdata}/ssl_key.pem (100%) rename Lib/test/{ => certdata}/talos-2019-0758.pem (100%) diff --git a/Lib/test/allsans.pem b/Lib/test/certdata/allsans.pem similarity index 100% rename from Lib/test/allsans.pem rename to 
Lib/test/certdata/allsans.pem diff --git a/Lib/test/badcert.pem b/Lib/test/certdata/badcert.pem similarity index 100% rename from Lib/test/badcert.pem rename to Lib/test/certdata/badcert.pem diff --git a/Lib/test/badkey.pem b/Lib/test/certdata/badkey.pem similarity index 100% rename from Lib/test/badkey.pem rename to Lib/test/certdata/badkey.pem diff --git a/Lib/test/capath/4e1295a3.0 b/Lib/test/certdata/capath/4e1295a3.0 similarity index 100% rename from Lib/test/capath/4e1295a3.0 rename to Lib/test/certdata/capath/4e1295a3.0 diff --git a/Lib/test/capath/5ed36f99.0 b/Lib/test/certdata/capath/5ed36f99.0 similarity index 100% rename from Lib/test/capath/5ed36f99.0 rename to Lib/test/certdata/capath/5ed36f99.0 diff --git a/Lib/test/capath/6e88d7b8.0 b/Lib/test/certdata/capath/6e88d7b8.0 similarity index 100% rename from Lib/test/capath/6e88d7b8.0 rename to Lib/test/certdata/capath/6e88d7b8.0 diff --git a/Lib/test/capath/99d0fa06.0 b/Lib/test/certdata/capath/99d0fa06.0 similarity index 100% rename from Lib/test/capath/99d0fa06.0 rename to Lib/test/certdata/capath/99d0fa06.0 diff --git a/Lib/test/capath/b1930218.0 b/Lib/test/certdata/capath/b1930218.0 similarity index 100% rename from Lib/test/capath/b1930218.0 rename to Lib/test/certdata/capath/b1930218.0 diff --git a/Lib/test/capath/ceff1710.0 b/Lib/test/certdata/capath/ceff1710.0 similarity index 100% rename from Lib/test/capath/ceff1710.0 rename to Lib/test/certdata/capath/ceff1710.0 diff --git a/Lib/test/ffdh3072.pem b/Lib/test/certdata/ffdh3072.pem similarity index 100% rename from Lib/test/ffdh3072.pem rename to Lib/test/certdata/ffdh3072.pem diff --git a/Lib/test/idnsans.pem b/Lib/test/certdata/idnsans.pem similarity index 100% rename from Lib/test/idnsans.pem rename to Lib/test/certdata/idnsans.pem diff --git a/Lib/test/keycert.passwd.pem b/Lib/test/certdata/keycert.passwd.pem similarity index 100% rename from Lib/test/keycert.passwd.pem rename to Lib/test/certdata/keycert.passwd.pem diff --git a/Lib/test/keycert.pem b/Lib/test/certdata/keycert.pem similarity index 100% rename from Lib/test/keycert.pem rename to Lib/test/certdata/keycert.pem diff --git a/Lib/test/keycert2.pem b/Lib/test/certdata/keycert2.pem similarity index 100% rename from Lib/test/keycert2.pem rename to Lib/test/certdata/keycert2.pem diff --git a/Lib/test/keycert3.pem b/Lib/test/certdata/keycert3.pem similarity index 100% rename from Lib/test/keycert3.pem rename to Lib/test/certdata/keycert3.pem diff --git a/Lib/test/keycert4.pem b/Lib/test/certdata/keycert4.pem similarity index 100% rename from Lib/test/keycert4.pem rename to Lib/test/certdata/keycert4.pem diff --git a/Lib/test/keycertecc.pem b/Lib/test/certdata/keycertecc.pem similarity index 100% rename from Lib/test/keycertecc.pem rename to Lib/test/certdata/keycertecc.pem diff --git a/Lib/test/make_ssl_certs.py b/Lib/test/certdata/make_ssl_certs.py similarity index 100% rename from Lib/test/make_ssl_certs.py rename to Lib/test/certdata/make_ssl_certs.py diff --git a/Lib/test/nokia.pem b/Lib/test/certdata/nokia.pem similarity index 100% rename from Lib/test/nokia.pem rename to Lib/test/certdata/nokia.pem diff --git a/Lib/test/nosan.pem b/Lib/test/certdata/nosan.pem similarity index 100% rename from Lib/test/nosan.pem rename to Lib/test/certdata/nosan.pem diff --git a/Lib/test/nullbytecert.pem b/Lib/test/certdata/nullbytecert.pem similarity index 100% rename from Lib/test/nullbytecert.pem rename to Lib/test/certdata/nullbytecert.pem diff --git a/Lib/test/nullcert.pem b/Lib/test/certdata/nullcert.pem similarity 
index 100% rename from Lib/test/nullcert.pem rename to Lib/test/certdata/nullcert.pem diff --git a/Lib/test/pycacert.pem b/Lib/test/certdata/pycacert.pem similarity index 100% rename from Lib/test/pycacert.pem rename to Lib/test/certdata/pycacert.pem diff --git a/Lib/test/pycakey.pem b/Lib/test/certdata/pycakey.pem similarity index 100% rename from Lib/test/pycakey.pem rename to Lib/test/certdata/pycakey.pem diff --git a/Lib/test/revocation.crl b/Lib/test/certdata/revocation.crl similarity index 100% rename from Lib/test/revocation.crl rename to Lib/test/certdata/revocation.crl diff --git a/Lib/test/secp384r1.pem b/Lib/test/certdata/secp384r1.pem similarity index 100% rename from Lib/test/secp384r1.pem rename to Lib/test/certdata/secp384r1.pem diff --git a/Lib/test/selfsigned_pythontestdotnet.pem b/Lib/test/certdata/selfsigned_pythontestdotnet.pem similarity index 100% rename from Lib/test/selfsigned_pythontestdotnet.pem rename to Lib/test/certdata/selfsigned_pythontestdotnet.pem diff --git a/Lib/test/ssl_cert.pem b/Lib/test/certdata/ssl_cert.pem similarity index 100% rename from Lib/test/ssl_cert.pem rename to Lib/test/certdata/ssl_cert.pem diff --git a/Lib/test/ssl_key.passwd.pem b/Lib/test/certdata/ssl_key.passwd.pem similarity index 100% rename from Lib/test/ssl_key.passwd.pem rename to Lib/test/certdata/ssl_key.passwd.pem diff --git a/Lib/test/ssl_key.pem b/Lib/test/certdata/ssl_key.pem similarity index 100% rename from Lib/test/ssl_key.pem rename to Lib/test/certdata/ssl_key.pem diff --git a/Lib/test/talos-2019-0758.pem b/Lib/test/certdata/talos-2019-0758.pem similarity index 100% rename from Lib/test/talos-2019-0758.pem rename to Lib/test/certdata/talos-2019-0758.pem diff --git a/Lib/test/ssl_servers.py b/Lib/test/ssl_servers.py index a4bd7455d47e76..15b071e04dda1f 100644 --- a/Lib/test/ssl_servers.py +++ b/Lib/test/ssl_servers.py @@ -14,7 +14,7 @@ here = os.path.dirname(__file__) HOST = socket_helper.HOST -CERTFILE = os.path.join(here, 'keycert.pem') +CERTFILE = os.path.join(here, 'certdata', 'keycert.pem') # This one's based on HTTPServer, which is based on socketserver diff --git a/Lib/test/test_asyncio/utils.py b/Lib/test/test_asyncio/utils.py index 19de8540fdb546..f49db4e37b3592 100644 --- a/Lib/test/test_asyncio/utils.py +++ b/Lib/test/test_asyncio/utils.py @@ -43,21 +43,21 @@ CLOCK_RES = 0.020 -def data_file(filename): +def data_file(*filename): if hasattr(support, 'TEST_HOME_DIR'): - fullname = os.path.join(support.TEST_HOME_DIR, filename) + fullname = os.path.join(support.TEST_HOME_DIR, *filename) if os.path.isfile(fullname): return fullname - fullname = os.path.join(os.path.dirname(__file__), '..', filename) + fullname = os.path.join(os.path.dirname(__file__), '..', *filename) if os.path.isfile(fullname): return fullname - raise FileNotFoundError(filename) + raise FileNotFoundError(os.path.join(filename)) -ONLYCERT = data_file('ssl_cert.pem') -ONLYKEY = data_file('ssl_key.pem') -SIGNED_CERTFILE = data_file('keycert3.pem') -SIGNING_CA = data_file('pycacert.pem') +ONLYCERT = data_file('certdata', 'ssl_cert.pem') +ONLYKEY = data_file('certdata', 'ssl_key.pem') +SIGNED_CERTFILE = data_file('certdata', 'keycert3.pem') +SIGNING_CA = data_file('certdata', 'pycacert.pem') PEERCERT = { 'OCSP': ('http://testca.pythontest.net/testca/ocsp/',), 'caIssuers': ('http://testca.pythontest.net/testca/pycacert.cer',), diff --git a/Lib/test/test_ftplib.py b/Lib/test/test_ftplib.py index 7d6b12ffadcef3..a90a53f278664f 100644 --- a/Lib/test/test_ftplib.py +++ b/Lib/test/test_ftplib.py @@ -327,8 
+327,8 @@ def handle_error(self): if ssl is not None: - CERTFILE = os.path.join(os.path.dirname(__file__), "keycert3.pem") - CAFILE = os.path.join(os.path.dirname(__file__), "pycacert.pem") + CERTFILE = os.path.join(os.path.dirname(__file__), "certdata", "keycert3.pem") + CAFILE = os.path.join(os.path.dirname(__file__), "certdata", "pycacert.pem") class SSLConnection(asyncore.dispatcher): """An asyncore.dispatcher subclass supporting TLS/SSL.""" diff --git a/Lib/test/test_httplib.py b/Lib/test/test_httplib.py index 47dbf08f3700d9..f6a9c820b54d31 100644 --- a/Lib/test/test_httplib.py +++ b/Lib/test/test_httplib.py @@ -23,11 +23,13 @@ here = os.path.dirname(__file__) # Self-signed cert file for 'localhost' -CERT_localhost = os.path.join(here, 'keycert.pem') +CERT_localhost = os.path.join(here, 'certdata', 'keycert.pem') # Self-signed cert file for 'fakehostname' -CERT_fakehostname = os.path.join(here, 'keycert2.pem') +CERT_fakehostname = os.path.join(here, 'certdata', 'keycert2.pem') # Self-signed cert file for self-signed.pythontest.net -CERT_selfsigned_pythontestdotnet = os.path.join(here, 'selfsigned_pythontestdotnet.pem') +CERT_selfsigned_pythontestdotnet = os.path.join( + here, 'certdata', 'selfsigned_pythontestdotnet.pem', +) # constants for testing chunked encoding chunked_start = ( diff --git a/Lib/test/test_imaplib.py b/Lib/test/test_imaplib.py index f097ba68154f25..bd0fc9c2da1c23 100644 --- a/Lib/test/test_imaplib.py +++ b/Lib/test/test_imaplib.py @@ -26,8 +26,8 @@ support.requires_working_socket(module=True) -CERTFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "keycert3.pem") -CAFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "pycacert.pem") +CERTFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "certdata", "keycert3.pem") +CAFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "certdata", "pycacert.pem") class TestImaplib(unittest.TestCase): diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index 55c5cd565814e8..ccf479d8e7e1c2 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -2075,7 +2075,7 @@ def test_output(self): sslctx = None else: here = os.path.dirname(__file__) - localhost_cert = os.path.join(here, "keycert.pem") + localhost_cert = os.path.join(here, "certdata", "keycert.pem") sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER) sslctx.load_cert_chain(localhost_cert) diff --git a/Lib/test/test_nntplib.py b/Lib/test/test_nntplib.py index 31a02f86abb003..30ae557978308c 100644 --- a/Lib/test/test_nntplib.py +++ b/Lib/test/test_nntplib.py @@ -20,7 +20,7 @@ ssl = None -certfile = os.path.join(os.path.dirname(__file__), 'keycert3.pem') +certfile = os.path.join(os.path.dirname(__file__), 'certdata', 'keycert3.pem') if ssl is not None: SSLError = ssl.SSLError diff --git a/Lib/test/test_poplib.py b/Lib/test/test_poplib.py index 5ad9202433dcfb..49ba9931974ec6 100644 --- a/Lib/test/test_poplib.py +++ b/Lib/test/test_poplib.py @@ -32,8 +32,8 @@ import ssl SUPPORTS_SSL = True - CERTFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "keycert3.pem") - CAFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "pycacert.pem") + CERTFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "certdata", "keycert3.pem") + CAFILE = os.path.join(os.path.dirname(__file__) or os.curdir, "certdata", "pycacert.pem") requires_ssl = skipUnless(SUPPORTS_SSL, 'SSL not supported') diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index 1bfec237484124..c98f7679a9f06e 100644 --- a/Lib/test/test_ssl.py +++ 
b/Lib/test/test_ssl.py @@ -63,10 +63,10 @@ PROTOCOL_TO_TLS_VERSION[proto] = ver def data_file(*name): - return os.path.join(os.path.dirname(__file__), *name) + return os.path.join(os.path.dirname(__file__), "certdata", *name) # The custom key and certificate files used in test_ssl are generated -# using Lib/test/make_ssl_certs.py. +# using Lib/test/certdata/make_ssl_certs.py. # Other certificates are simply fetched from the internet servers they # are meant to authenticate. @@ -675,7 +675,7 @@ def test_errors_sslwrap(self): def bad_cert_test(self, certfile): """Check that trying to use the given client certificate fails""" certfile = os.path.join(os.path.dirname(__file__) or os.curdir, - certfile) + "certdata", certfile) sock = socket.socket() self.addCleanup(sock.close) with self.assertRaises(ssl.SSLError): @@ -3557,12 +3557,12 @@ def test_socketserver(self): # try to connect if support.verbose: sys.stdout.write('\n') - with open(CERTFILE, 'rb') as f: + # Get this test file itself: + with open(__file__, 'rb') as f: d1 = f.read() d2 = '' # now fetch the same data from the HTTPS server - url = 'https://localhost:%d/%s' % ( - server.port, os.path.split(CERTFILE)[1]) + url = f'https://localhost:{server.port}/test_ssl.py' context = ssl.create_default_context(cafile=SIGNING_CA) f = urllib.request.urlopen(url, context=context) try: diff --git a/Lib/test/test_urllib2_localnet.py b/Lib/test/test_urllib2_localnet.py index f4729358557c95..96e43970d49fb9 100644 --- a/Lib/test/test_urllib2_localnet.py +++ b/Lib/test/test_urllib2_localnet.py @@ -22,9 +22,9 @@ here = os.path.dirname(__file__) # Self-signed cert file for 'localhost' -CERT_localhost = os.path.join(here, 'keycert.pem') +CERT_localhost = os.path.join(here, 'certdata', 'keycert.pem') # Self-signed cert file for 'fakehostname' -CERT_fakehostname = os.path.join(here, 'keycert2.pem') +CERT_fakehostname = os.path.join(here, 'certdata', 'keycert2.pem') # Loopback http server infrastructure diff --git a/Makefile.pre.in b/Makefile.pre.in index 1dd8189167aa21..49238902718eb8 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1947,7 +1947,8 @@ TESTSUBDIRS= ctypes/test \ lib2to3/tests/data/fixers/myfixes \ test \ test/audiodata \ - test/capath \ + test/certdata \ + test/certdata/capath \ test/cjkencodings \ test/crashers \ test/data \ From a009bb872c57ad3317caaf4caad5327d0ff664e2 Mon Sep 17 00:00:00 2001 From: "Erlend E. Aasland" Date: Wed, 11 Oct 2023 00:41:12 +0200 Subject: [PATCH 056/265] [3.11] gh-109286: Update Windows installer to use SQLite 3.43.1 (#110403) (#110479) --- .../next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst | 1 + PCbuild/get_externals.bat | 2 +- PCbuild/python.props | 2 +- PCbuild/readme.txt | 2 +- 4 files changed, 4 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst diff --git a/Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst b/Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst new file mode 100644 index 00000000000000..14a2aff70803ce --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst @@ -0,0 +1 @@ +Update Windows installer to use SQLite 3.43.1. 
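A quick way to confirm which SQLite library a given build actually bundles after a bump like this is to ask the sqlite3 module at runtime (a convenience check, not part of the patch):

    import sqlite3

    # Version of the SQLite C library the _sqlite3 extension was built against;
    # expected to report 3.43.1 once this update is picked up.
    print(sqlite3.sqlite_version)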
diff --git a/PCbuild/get_externals.bat b/PCbuild/get_externals.bat index 60dc725bb6a2fb..5e2781e9757c4d 100644 --- a/PCbuild/get_externals.bat +++ b/PCbuild/get_externals.bat @@ -54,7 +54,7 @@ set libraries= set libraries=%libraries% bzip2-1.0.8 if NOT "%IncludeLibffiSrc%"=="false" set libraries=%libraries% libffi-3.4.4 if NOT "%IncludeSSLSrc%"=="false" set libraries=%libraries% openssl-3.0.11 -set libraries=%libraries% sqlite-3.42.0.0 +set libraries=%libraries% sqlite-3.43.1.0 if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tcl-core-8.6.12.1 if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tk-8.6.12.1 if NOT "%IncludeTkinterSrc%"=="false" set libraries=%libraries% tix-8.4.3.6 diff --git a/PCbuild/python.props b/PCbuild/python.props index 35c6e92be45dc9..496bc3dd4cf794 100644 --- a/PCbuild/python.props +++ b/PCbuild/python.props @@ -68,7 +68,7 @@ - $(ExternalsDir)sqlite-3.42.0.0\ + $(ExternalsDir)sqlite-3.43.1.0\ $(ExternalsDir)bzip2-1.0.8\ $(ExternalsDir)xz-5.2.5\ $(ExternalsDir)libffi-3.4.4\ diff --git a/PCbuild/readme.txt b/PCbuild/readme.txt index d6149a3cedeaa3..6a2c7a95b83718 100644 --- a/PCbuild/readme.txt +++ b/PCbuild/readme.txt @@ -187,7 +187,7 @@ _ssl again when building. _sqlite3 - Wraps SQLite 3.42.0, which is itself built by sqlite3.vcxproj + Wraps SQLite 3.43.1, which is itself built by sqlite3.vcxproj Homepage: https://www.sqlite.org/ _tkinter From 46347d3caf92e773f99b301ab4de7d13c99e04e3 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 11 Oct 2023 02:21:51 +0200 Subject: [PATCH 057/265] [3.11] gh-110647: Fix signal test_stress_modifying_handlers() (GH-110650) (#110659) gh-110647: Fix signal test_stress_modifying_handlers() (GH-110650) * cycle_handlers() now waits until at least one signal is received. * num_received_signals can be equal to num_sent_signals. (cherry picked from commit e07c37cd5212c9d13749b4d02a1d68e1efcba6cf) Co-authored-by: Victor Stinner --- Lib/test/test_signal.py | 4 ++-- .../next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst | 2 ++ 2 files changed, 4 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst diff --git a/Lib/test/test_signal.py b/Lib/test/test_signal.py index a00c7e539400d9..0f3343b10e4a07 100644 --- a/Lib/test/test_signal.py +++ b/Lib/test/test_signal.py @@ -1347,7 +1347,7 @@ def set_interrupts(): num_sent_signals += 1 def cycle_handlers(): - while num_sent_signals < 100: + while num_sent_signals < 100 or num_received_signals < 1: for i in range(20000): # Cycle between a Python-defined and a non-Python handler for handler in [custom_handler, signal.SIG_IGN]: @@ -1380,7 +1380,7 @@ def cycle_handlers(): if not ignored: # Sanity check that some signals were received, but not all self.assertGreater(num_received_signals, 0) - self.assertLess(num_received_signals, num_sent_signals) + self.assertLessEqual(num_received_signals, num_sent_signals) finally: do_stop = True t.join() diff --git a/Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst b/Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst new file mode 100644 index 00000000000000..00f38c844755ec --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst @@ -0,0 +1,2 @@ +Fix test_stress_modifying_handlers() of test_signal. Patch by Victor +Stinner. 
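The machinery that test stresses is ordinary signal-handler registration; stripped of the stress loop it reduces to something like the following sketch (POSIX-only, using SIGUSR1; the names are illustrative, not the test's own):

    import os
    import signal

    received = 0

    def custom_handler(signum, frame):
        # Runs in the main thread the next time the interpreter checks for signals.
        global received
        received += 1

    signal.signal(signal.SIGUSR1, custom_handler)   # install a Python-level handler
    os.kill(os.getpid(), signal.SIGUSR1)            # deliver a signal to ourselves
    signal.signal(signal.SIGUSR1, signal.SIG_IGN)   # now ignore the signal instead
    os.kill(os.getpid(), signal.SIGUSR1)            # this delivery is dropped

    print(received)   # typically 1: only the first delivery reached the handler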
From 7984dc28b3c3163b0e58ba326417cd3e4bc3dbaa Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 11 Oct 2023 03:22:51 +0200 Subject: [PATCH 058/265] [3.11] gh-110656: Fix logging test_post_fork_child_no_deadlock() if ASAN (GH-110657) (#110665) gh-110656: Fix logging test_post_fork_child_no_deadlock() if ASAN (GH-110657) Skip test_post_fork_child_no_deadlock() if Python is built with ASAN. Add support.HAVE_ASAN_FORK_BUG. (cherry picked from commit f901f56313610389027cb4eae80d1d4b071aef69) Co-authored-by: Victor Stinner --- Lib/test/_test_multiprocessing.py | 4 ++-- Lib/test/support/__init__.py | 4 ++++ Lib/test/test_logging.py | 8 ++++++++ Lib/test/test_threading.py | 9 +-------- 4 files changed, 15 insertions(+), 10 deletions(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 42441fed79bf3e..834bc6b87fb8d2 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -77,10 +77,10 @@ msvcrt = None -if support.check_sanitizer(address=True): +if support.HAVE_ASAN_FORK_BUG: # gh-89363: Skip multiprocessing tests if Python is built with ASAN to # work around a libasan race condition: dead lock in pthread_create(). - raise unittest.SkipTest("libasan has a pthread_create() dead lock") + raise unittest.SkipTest("libasan has a pthread_create() dead lock related to thread+fork") def latin(s): diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 1d3b0053bf86c9..f5307b6442bf66 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -426,6 +426,10 @@ def skip_if_sanitizer(reason=None, *, address=False, memory=False, ub=False): skip = check_sanitizer(address=address, memory=memory, ub=ub) return unittest.skipIf(skip, reason) +# gh-89363: True if fork() can hang if Python is built with Address Sanitizer +# (ASAN): libasan race condition, dead lock in pthread_create(). +HAVE_ASAN_FORK_BUG = check_sanitizer(address=True) + def system_must_validate_cert(f): """Skip the test on TLS certificate validation failures.""" diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index ccf479d8e7e1c2..e097acd85eb72b 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -76,6 +76,13 @@ pass +# gh-89363: Skip fork() test if Python is built with Address Sanitizer (ASAN) +# to work around a libasan race condition, dead lock in pthread_create(). +skip_if_asan_fork = unittest.skipIf( + support.HAVE_ASAN_FORK_BUG, + "libasan has a pthread_create() dead lock related to thread+fork") + + class BaseTest(unittest.TestCase): """Base class for logging tests.""" @@ -682,6 +689,7 @@ def remove_loop(fname, tries): # register_at_fork mechanism is also present and used. @support.requires_fork() @threading_helper.requires_working_threading() + @skip_if_asan_fork def test_post_fork_child_no_deadlock(self): """Ensure child logging locks are not held; bpo-6721 & bpo-36533.""" class _OurHandler(logging.Handler): diff --git a/Lib/test/test_threading.py b/Lib/test/test_threading.py index a4192acd6dbc5c..17319c366a483b 100644 --- a/Lib/test/test_threading.py +++ b/Lib/test/test_threading.py @@ -37,19 +37,12 @@ Py_DEBUG = hasattr(sys, 'gettotalrefcount') -# gh-89363: Skip fork() test if Python is built with Address Sanitizer (ASAN) -# to work around a libasan race condition, dead lock in pthread_create(). 
-skip_if_asan_fork = support.skip_if_sanitizer( - "libasan has a pthread_create() dead lock", - address=True) - - def skip_unless_reliable_fork(test): if not support.has_fork_support: return unittest.skip("requires working os.fork()")(test) if sys.platform in platforms_to_skip: return unittest.skip("due to known OS bug related to thread+fork")(test) - if support.check_sanitizer(address=True): + if support.HAVE_ASAN_FORK_BUG: return unittest.skip("libasan has a pthread_create() dead lock related to thread+fork")(test) return test From 8ca7a230e7f6734987478a4c4db2efded594fb84 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 11 Oct 2023 04:09:53 +0200 Subject: [PATCH 059/265] [3.11] gh-110666: Fix multiprocessing test_terminate() elapsed (GH-110667) (#110669) gh-110666: Fix multiprocessing test_terminate() elapsed (GH-110667) multiprocessing test_terminate() and test_wait_socket_slow() no longer test the CI performance: no longer check maximum elapsed time. Add CLOCK_RES constant: tolerate a difference of 100 ms. (cherry picked from commit 1556f426da3f2fb5842689999933c8038b65c034) Co-authored-by: Victor Stinner --- Lib/test/_test_multiprocessing.py | 35 +++++++++++++------------------ 1 file changed, 15 insertions(+), 20 deletions(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 834bc6b87fb8d2..84bca5b3b259fa 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -83,6 +83,11 @@ raise unittest.SkipTest("libasan has a pthread_create() dead lock related to thread+fork") +# gh-110666: Tolerate a difference of 100 ms when comparing timings +# (clock resolution) +CLOCK_RES = 0.100 + + def latin(s): return s.encode('latin') @@ -1654,8 +1659,7 @@ def _test_waitfor_timeout_f(cls, cond, state, success, sem): dt = time.monotonic() result = cond.wait_for(lambda : state.value==4, timeout=expected) dt = time.monotonic() - dt - # borrow logic in assertTimeout() from test/lock_tests.py - if not result and expected * 0.6 <= dt: + if not result and (expected - CLOCK_RES) <= dt: success.value = True @unittest.skipUnless(HAS_SHAREDCTYPES, 'needs sharedctypes') @@ -2677,14 +2681,11 @@ def test_make_pool(self): p.join() def test_terminate(self): - result = self.pool.map_async( - time.sleep, [0.1 for i in range(10000)], chunksize=1 - ) + # Simulate slow tasks which take "forever" to complete + args = [support.LONG_TIMEOUT for i in range(10_000)] + result = self.pool.map_async(time.sleep, args, chunksize=1) self.pool.terminate() - join = TimingWrapper(self.pool.join) - join() - # Sanity check the pool didn't wait for all tasks to finish - self.assertLess(join.elapsed, 2.0) + self.pool.join() def test_empty_iterable(self): # See Issue 12157 @@ -4831,7 +4832,7 @@ class TestWait(unittest.TestCase): def _child_test_wait(cls, w, slow): for i in range(10): if slow: - time.sleep(random.random()*0.1) + time.sleep(random.random() * 0.100) w.send((i, os.getpid())) w.close() @@ -4871,7 +4872,7 @@ def _child_test_wait_socket(cls, address, slow): s.connect(address) for i in range(10): if slow: - time.sleep(random.random()*0.1) + time.sleep(random.random() * 0.100) s.sendall(('%s\n' % i).encode('ascii')) s.close() @@ -4920,25 +4921,19 @@ def test_wait_socket_slow(self): def test_wait_timeout(self): from multiprocessing.connection import wait - expected = 5 + timeout = 5.0 # seconds a, b = multiprocessing.Pipe() start = time.monotonic() - res = wait([a, b], expected) + res = wait([a, b], 
timeout) delta = time.monotonic() - start self.assertEqual(res, []) - self.assertLess(delta, expected * 2) - self.assertGreater(delta, expected * 0.5) + self.assertGreater(delta, timeout - CLOCK_RES) b.send(None) - - start = time.monotonic() res = wait([a, b], 20) - delta = time.monotonic() - start - self.assertEqual(res, [a]) - self.assertLess(delta, 0.4) @classmethod def signal_and_sleep(cls, sem, period): From c6d5628be950bdf2c31243b4cc0d9e0b658458dd Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 11 Oct 2023 05:07:03 +0200 Subject: [PATCH 060/265] [3.11] gh-110662: multiprocessing test_async_timeout() increase timeout (GH-110663) (#110675) gh-110662: multiprocessing test_async_timeout() increase timeout (GH-110663) Increase timeout from 1 second to 30 seconds, if not longer. The important part is that apply_async() takes longer than TIMEOUT2. (cherry picked from commit 790ecf6302e47b84da5d1c3b14dbdf070bce615b) Co-authored-by: Victor Stinner --- Lib/test/_test_multiprocessing.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 84bca5b3b259fa..3bcb1d1d46991f 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -2577,7 +2577,7 @@ def test_async(self): self.assertTimingAlmostEqual(get.elapsed, TIMEOUT1) def test_async_timeout(self): - res = self.pool.apply_async(sqr, (6, TIMEOUT2 + 1.0)) + res = self.pool.apply_async(sqr, (6, TIMEOUT2 + support.SHORT_TIMEOUT)) get = TimingWrapper(res.get) self.assertRaises(multiprocessing.TimeoutError, get, timeout=TIMEOUT2) self.assertTimingAlmostEqual(get.elapsed, TIMEOUT2) From a8b2d12a258a72963beb86406c69eb928863a09b Mon Sep 17 00:00:00 2001 From: Ezio Melotti Date: Wed, 11 Oct 2023 11:53:26 +0200 Subject: [PATCH 061/265] [3.11] gh-110631: fix wrong indentation in the `Doc/whatsnew` dir (GH-110632) (#110691) fix wrong indentation in the `Doc/whatsnew` dir (#110632) --- Doc/whatsnew/2.6.rst | 20 ++--- Doc/whatsnew/2.7.rst | 2 +- Doc/whatsnew/3.10.rst | 200 +++++++++++++++++++++--------------------- Doc/whatsnew/3.3.rst | 128 +++++++++++++-------------- Doc/whatsnew/3.7.rst | 14 +-- 5 files changed, 182 insertions(+), 182 deletions(-) diff --git a/Doc/whatsnew/2.6.rst b/Doc/whatsnew/2.6.rst index 96d9b792b3723a..759a0757ebefa7 100644 --- a/Doc/whatsnew/2.6.rst +++ b/Doc/whatsnew/2.6.rst @@ -875,11 +875,11 @@ The signature of the new function is:: The parameters are: - * *args*: positional arguments whose values will be printed out. - * *sep*: the separator, which will be printed between arguments. - * *end*: the ending text, which will be printed after all of the - arguments have been output. - * *file*: the file object to which the output will be sent. +* *args*: positional arguments whose values will be printed out. +* *sep*: the separator, which will be printed between arguments. +* *end*: the ending text, which will be printed after all of the + arguments have been output. +* *file*: the file object to which the output will be sent. .. seealso:: @@ -1138,13 +1138,13 @@ indicate that the external caller is done. The *flags* argument to :c:func:`PyObject_GetBuffer` specifies constraints upon the memory returned. Some examples are: - * :c:macro:`PyBUF_WRITABLE` indicates that the memory must be writable. +* :c:macro:`PyBUF_WRITABLE` indicates that the memory must be writable. - * :c:macro:`PyBUF_LOCK` requests a read-only or exclusive lock on the memory. 
+* :c:macro:`PyBUF_LOCK` requests a read-only or exclusive lock on the memory. - * :c:macro:`PyBUF_C_CONTIGUOUS` and :c:macro:`PyBUF_F_CONTIGUOUS` - requests a C-contiguous (last dimension varies the fastest) or - Fortran-contiguous (first dimension varies the fastest) array layout. +* :c:macro:`PyBUF_C_CONTIGUOUS` and :c:macro:`PyBUF_F_CONTIGUOUS` + requests a C-contiguous (last dimension varies the fastest) or + Fortran-contiguous (first dimension varies the fastest) array layout. Two new argument codes for :c:func:`PyArg_ParseTuple`, ``s*`` and ``z*``, return locked buffer objects for a parameter. diff --git a/Doc/whatsnew/2.7.rst b/Doc/whatsnew/2.7.rst index 787fe8468b50d1..caf28ff50b3aa2 100644 --- a/Doc/whatsnew/2.7.rst +++ b/Doc/whatsnew/2.7.rst @@ -2367,7 +2367,7 @@ Port-Specific Changes: Mac OS X installation and a user-installed copy of the same version. (Changed by Ronald Oussoren; :issue:`4865`.) - .. versionchanged:: 2.7.13 + .. versionchanged:: 2.7.13 As of 2.7.13, this change was removed. ``/Library/Python/2.7/site-packages``, the site-packages directory diff --git a/Doc/whatsnew/3.10.rst b/Doc/whatsnew/3.10.rst index 572bb4b07e5030..144a4940794d81 100644 --- a/Doc/whatsnew/3.10.rst +++ b/Doc/whatsnew/3.10.rst @@ -221,116 +221,116 @@ have been incorporated. Some of the most notable ones are as follows: * Missing ``:`` before blocks: - .. code-block:: python + .. code-block:: python - >>> if rocket.position > event_horizon - File "", line 1 - if rocket.position > event_horizon - ^ - SyntaxError: expected ':' + >>> if rocket.position > event_horizon + File "", line 1 + if rocket.position > event_horizon + ^ + SyntaxError: expected ':' - (Contributed by Pablo Galindo in :issue:`42997`.) + (Contributed by Pablo Galindo in :issue:`42997`.) * Unparenthesised tuples in comprehensions targets: - .. code-block:: python + .. code-block:: python - >>> {x,y for x,y in zip('abcd', '1234')} - File "", line 1 - {x,y for x,y in zip('abcd', '1234')} - ^ - SyntaxError: did you forget parentheses around the comprehension target? + >>> {x,y for x,y in zip('abcd', '1234')} + File "", line 1 + {x,y for x,y in zip('abcd', '1234')} + ^ + SyntaxError: did you forget parentheses around the comprehension target? - (Contributed by Pablo Galindo in :issue:`43017`.) + (Contributed by Pablo Galindo in :issue:`43017`.) * Missing commas in collection literals and between expressions: - .. code-block:: python + .. code-block:: python - >>> items = { - ... x: 1, - ... y: 2 - ... z: 3, - File "", line 3 - y: 2 - ^ - SyntaxError: invalid syntax. Perhaps you forgot a comma? + >>> items = { + ... x: 1, + ... y: 2 + ... z: 3, + File "", line 3 + y: 2 + ^ + SyntaxError: invalid syntax. Perhaps you forgot a comma? - (Contributed by Pablo Galindo in :issue:`43822`.) + (Contributed by Pablo Galindo in :issue:`43822`.) * Multiple Exception types without parentheses: - .. code-block:: python + .. code-block:: python - >>> try: - ... build_dyson_sphere() - ... except NotEnoughScienceError, NotEnoughResourcesError: - File "", line 3 - except NotEnoughScienceError, NotEnoughResourcesError: - ^ - SyntaxError: multiple exception types must be parenthesized + >>> try: + ... build_dyson_sphere() + ... except NotEnoughScienceError, NotEnoughResourcesError: + File "", line 3 + except NotEnoughScienceError, NotEnoughResourcesError: + ^ + SyntaxError: multiple exception types must be parenthesized - (Contributed by Pablo Galindo in :issue:`43149`.) + (Contributed by Pablo Galindo in :issue:`43149`.) 
* Missing ``:`` and values in dictionary literals: - .. code-block:: python + .. code-block:: python - >>> values = { - ... x: 1, - ... y: 2, - ... z: - ... } - File "", line 4 - z: - ^ - SyntaxError: expression expected after dictionary key and ':' + >>> values = { + ... x: 1, + ... y: 2, + ... z: + ... } + File "", line 4 + z: + ^ + SyntaxError: expression expected after dictionary key and ':' - >>> values = {x:1, y:2, z w:3} - File "", line 1 - values = {x:1, y:2, z w:3} - ^ - SyntaxError: ':' expected after dictionary key + >>> values = {x:1, y:2, z w:3} + File "", line 1 + values = {x:1, y:2, z w:3} + ^ + SyntaxError: ':' expected after dictionary key - (Contributed by Pablo Galindo in :issue:`43823`.) + (Contributed by Pablo Galindo in :issue:`43823`.) * ``try`` blocks without ``except`` or ``finally`` blocks: - .. code-block:: python + .. code-block:: python - >>> try: - ... x = 2 - ... something = 3 - File "", line 3 - something = 3 - ^^^^^^^^^ - SyntaxError: expected 'except' or 'finally' block + >>> try: + ... x = 2 + ... something = 3 + File "", line 3 + something = 3 + ^^^^^^^^^ + SyntaxError: expected 'except' or 'finally' block - (Contributed by Pablo Galindo in :issue:`44305`.) + (Contributed by Pablo Galindo in :issue:`44305`.) * Usage of ``=`` instead of ``==`` in comparisons: - .. code-block:: python + .. code-block:: python - >>> if rocket.position = event_horizon: - File "", line 1 - if rocket.position = event_horizon: - ^ - SyntaxError: cannot assign to attribute here. Maybe you meant '==' instead of '='? + >>> if rocket.position = event_horizon: + File "", line 1 + if rocket.position = event_horizon: + ^ + SyntaxError: cannot assign to attribute here. Maybe you meant '==' instead of '='? - (Contributed by Pablo Galindo in :issue:`43797`.) + (Contributed by Pablo Galindo in :issue:`43797`.) * Usage of ``*`` in f-strings: - .. code-block:: python + .. code-block:: python - >>> f"Black holes {*all_black_holes} and revelations" - File "", line 1 - (*all_black_holes) - ^ - SyntaxError: f-string: cannot use starred expression here + >>> f"Black holes {*all_black_holes} and revelations" + File "", line 1 + (*all_black_holes) + ^ + SyntaxError: f-string: cannot use starred expression here - (Contributed by Pablo Galindo in :issue:`41064`.) + (Contributed by Pablo Galindo in :issue:`41064`.) IndentationErrors ~~~~~~~~~~~~~~~~~ @@ -365,10 +365,10 @@ raised from: (Contributed by Pablo Galindo in :issue:`38530`.) - .. warning:: - Notice this won't work if :c:func:`PyErr_Display` is not called to display the error - which can happen if some other custom error display function is used. This is a common - scenario in some REPLs like IPython. +.. warning:: + Notice this won't work if :c:func:`PyErr_Display` is not called to display the error + which can happen if some other custom error display function is used. This is a common + scenario in some REPLs like IPython. NameErrors ~~~~~~~~~~ @@ -387,10 +387,10 @@ was raised from: (Contributed by Pablo Galindo in :issue:`38530`.) - .. warning:: - Notice this won't work if :c:func:`PyErr_Display` is not called to display the error, - which can happen if some other custom error display function is used. This is a common - scenario in some REPLs like IPython. +.. warning:: + Notice this won't work if :c:func:`PyErr_Display` is not called to display the error, + which can happen if some other custom error display function is used. This is a common + scenario in some REPLs like IPython. 
PEP 626: Precise line numbers for debugging and other tools @@ -433,16 +433,16 @@ A match statement takes an expression and compares its value to successive patterns given as one or more case blocks. Specifically, pattern matching operates by: - 1. using data with type and shape (the ``subject``) - 2. evaluating the ``subject`` in the ``match`` statement - 3. comparing the subject with each pattern in a ``case`` statement - from top to bottom until a match is confirmed. - 4. executing the action associated with the pattern of the confirmed - match - 5. If an exact match is not confirmed, the last case, a wildcard ``_``, - if provided, will be used as the matching case. If an exact match is - not confirmed and a wildcard case does not exist, the entire match - block is a no-op. +1. using data with type and shape (the ``subject``) +2. evaluating the ``subject`` in the ``match`` statement +3. comparing the subject with each pattern in a ``case`` statement + from top to bottom until a match is confirmed. +4. executing the action associated with the pattern of the confirmed + match +5. If an exact match is not confirmed, the last case, a wildcard ``_``, + if provided, will be used as the matching case. If an exact match is + not confirmed and a wildcard case does not exist, the entire match + block is a no-op. Declarative approach ~~~~~~~~~~~~~~~~~~~~ @@ -2210,16 +2210,16 @@ Removed * Removed ``Py_UNICODE_str*`` functions manipulating ``Py_UNICODE*`` strings. (Contributed by Inada Naoki in :issue:`41123`.) - * ``Py_UNICODE_strlen``: use :c:func:`PyUnicode_GetLength` or - :c:macro:`PyUnicode_GET_LENGTH` - * ``Py_UNICODE_strcat``: use :c:func:`PyUnicode_CopyCharacters` or - :c:func:`PyUnicode_FromFormat` - * ``Py_UNICODE_strcpy``, ``Py_UNICODE_strncpy``: use - :c:func:`PyUnicode_CopyCharacters` or :c:func:`PyUnicode_Substring` - * ``Py_UNICODE_strcmp``: use :c:func:`PyUnicode_Compare` - * ``Py_UNICODE_strncmp``: use :c:func:`PyUnicode_Tailmatch` - * ``Py_UNICODE_strchr``, ``Py_UNICODE_strrchr``: use - :c:func:`PyUnicode_FindChar` + * ``Py_UNICODE_strlen``: use :c:func:`PyUnicode_GetLength` or + :c:macro:`PyUnicode_GET_LENGTH` + * ``Py_UNICODE_strcat``: use :c:func:`PyUnicode_CopyCharacters` or + :c:func:`PyUnicode_FromFormat` + * ``Py_UNICODE_strcpy``, ``Py_UNICODE_strncpy``: use + :c:func:`PyUnicode_CopyCharacters` or :c:func:`PyUnicode_Substring` + * ``Py_UNICODE_strcmp``: use :c:func:`PyUnicode_Compare` + * ``Py_UNICODE_strncmp``: use :c:func:`PyUnicode_Tailmatch` + * ``Py_UNICODE_strchr``, ``Py_UNICODE_strrchr``: use + :c:func:`PyUnicode_FindChar` * Removed ``PyUnicode_GetMax()``. Please migrate to new (:pep:`393`) APIs. (Contributed by Inada Naoki in :issue:`41103`.) diff --git a/Doc/whatsnew/3.3.rst b/Doc/whatsnew/3.3.rst index becff2f27a309e..3bc3d2f7c3740e 100644 --- a/Doc/whatsnew/3.3.rst +++ b/Doc/whatsnew/3.3.rst @@ -916,12 +916,12 @@ abstract methods. The recommended approach to declaring abstract descriptors is now to provide :attr:`__isabstractmethod__` as a dynamically updated property. The built-in descriptors have been updated accordingly. - * :class:`abc.abstractproperty` has been deprecated, use :class:`property` - with :func:`abc.abstractmethod` instead. - * :class:`abc.abstractclassmethod` has been deprecated, use - :class:`classmethod` with :func:`abc.abstractmethod` instead. - * :class:`abc.abstractstaticmethod` has been deprecated, use - :class:`staticmethod` with :func:`abc.abstractmethod` instead. 
+* :class:`abc.abstractproperty` has been deprecated, use :class:`property` + with :func:`abc.abstractmethod` instead. +* :class:`abc.abstractclassmethod` has been deprecated, use + :class:`classmethod` with :func:`abc.abstractmethod` instead. +* :class:`abc.abstractstaticmethod` has been deprecated, use + :class:`staticmethod` with :func:`abc.abstractmethod` instead. (Contributed by Darren Dale in :issue:`11610`.) @@ -1059,32 +1059,32 @@ function to the :mod:`crypt` module. curses ------ - * If the :mod:`curses` module is linked to the ncursesw library, use Unicode - functions when Unicode strings or characters are passed (e.g. - :c:func:`waddwstr`), and bytes functions otherwise (e.g. :c:func:`waddstr`). - * Use the locale encoding instead of ``utf-8`` to encode Unicode strings. - * :class:`curses.window` has a new :attr:`curses.window.encoding` attribute. - * The :class:`curses.window` class has a new :meth:`~curses.window.get_wch` - method to get a wide character - * The :mod:`curses` module has a new :meth:`~curses.unget_wch` function to - push a wide character so the next :meth:`~curses.window.get_wch` will return - it +* If the :mod:`curses` module is linked to the ncursesw library, use Unicode + functions when Unicode strings or characters are passed (e.g. + :c:func:`waddwstr`), and bytes functions otherwise (e.g. :c:func:`waddstr`). +* Use the locale encoding instead of ``utf-8`` to encode Unicode strings. +* :class:`curses.window` has a new :attr:`curses.window.encoding` attribute. +* The :class:`curses.window` class has a new :meth:`~curses.window.get_wch` + method to get a wide character +* The :mod:`curses` module has a new :meth:`~curses.unget_wch` function to + push a wide character so the next :meth:`~curses.window.get_wch` will return + it (Contributed by Iñigo Serna in :issue:`6755`.) datetime -------- - * Equality comparisons between naive and aware :class:`~datetime.datetime` - instances now return :const:`False` instead of raising :exc:`TypeError` - (:issue:`15006`). - * New :meth:`datetime.datetime.timestamp` method: Return POSIX timestamp - corresponding to the :class:`~datetime.datetime` instance. - * The :meth:`datetime.datetime.strftime` method supports formatting years - older than 1000. - * The :meth:`datetime.datetime.astimezone` method can now be - called without arguments to convert datetime instance to the system - timezone. +* Equality comparisons between naive and aware :class:`~datetime.datetime` + instances now return :const:`False` instead of raising :exc:`TypeError` + (:issue:`15006`). +* New :meth:`datetime.datetime.timestamp` method: Return POSIX timestamp + corresponding to the :class:`~datetime.datetime` instance. +* The :meth:`datetime.datetime.strftime` method supports formatting years + older than 1000. +* The :meth:`datetime.datetime.astimezone` method can now be + called without arguments to convert datetime instance to the system + timezone. .. _new-decimal: @@ -1209,25 +1209,25 @@ the ``Message`` object it is serializing. The default policy is The minimum set of controls implemented by all ``policy`` objects are: - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| - =============== ======================================================= - max_line_length The maximum length, excluding the linesep character(s), - individual lines may have when a ``Message`` is - serialized. Defaults to 78. 
+=============== ======================================================= +max_line_length The maximum length, excluding the linesep character(s), + individual lines may have when a ``Message`` is + serialized. Defaults to 78. - linesep The character used to separate individual lines when a - ``Message`` is serialized. Defaults to ``\n``. +linesep The character used to separate individual lines when a + ``Message`` is serialized. Defaults to ``\n``. - cte_type ``7bit`` or ``8bit``. ``8bit`` applies only to a - ``Bytes`` ``generator``, and means that non-ASCII may - be used where allowed by the protocol (or where it - exists in the original input). +cte_type ``7bit`` or ``8bit``. ``8bit`` applies only to a + ``Bytes`` ``generator``, and means that non-ASCII may + be used where allowed by the protocol (or where it + exists in the original input). - raise_on_defect Causes a ``parser`` to raise error when defects are - encountered instead of adding them to the ``Message`` - object's ``defects`` list. - =============== ======================================================= +raise_on_defect Causes a ``parser`` to raise error when defects are + encountered instead of adding them to the ``Message`` + object's ``defects`` list. +=============== ======================================================= A new policy instance, with new settings, is created using the :meth:`~email.policy.Policy.clone` method of policy objects. ``clone`` takes @@ -1262,21 +1262,21 @@ removal of the code) may occur if deemed necessary by the core developers. The new policies are instances of :class:`~email.policy.EmailPolicy`, and add the following additional controls: - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| - =============== ======================================================= - refold_source Controls whether or not headers parsed by a - :mod:`~email.parser` are refolded by the - :mod:`~email.generator`. It can be ``none``, ``long``, - or ``all``. The default is ``long``, which means that - source headers with a line longer than - ``max_line_length`` get refolded. ``none`` means no - line get refolded, and ``all`` means that all lines - get refolded. +=============== ======================================================= +refold_source Controls whether or not headers parsed by a + :mod:`~email.parser` are refolded by the + :mod:`~email.generator`. It can be ``none``, ``long``, + or ``all``. The default is ``long``, which means that + source headers with a line longer than + ``max_line_length`` get refolded. ``none`` means no + line get refolded, and ``all`` means that all lines + get refolded. - header_factory A callable that take a ``name`` and ``value`` and - produces a custom header object. - =============== ======================================================= +header_factory A callable that take a ``name`` and ``value`` and + produces a custom header object. +=============== ======================================================= The ``header_factory`` is the key to the new features provided by the new policies. When one of the new policies is used, any header retrieved from @@ -1351,18 +1351,18 @@ API. New utility functions: - * :func:`~email.utils.format_datetime`: given a :class:`~datetime.datetime`, - produce a string formatted for use in an email header. +* :func:`~email.utils.format_datetime`: given a :class:`~datetime.datetime`, + produce a string formatted for use in an email header. 
- * :func:`~email.utils.parsedate_to_datetime`: given a date string from - an email header, convert it into an aware :class:`~datetime.datetime`, - or a naive :class:`~datetime.datetime` if the offset is ``-0000``. +* :func:`~email.utils.parsedate_to_datetime`: given a date string from + an email header, convert it into an aware :class:`~datetime.datetime`, + or a naive :class:`~datetime.datetime` if the offset is ``-0000``. - * :func:`~email.utils.localtime`: With no argument, returns the - current local time as an aware :class:`~datetime.datetime` using the local - :class:`~datetime.timezone`. Given an aware :class:`~datetime.datetime`, - converts it into an aware :class:`~datetime.datetime` using the - local :class:`~datetime.timezone`. +* :func:`~email.utils.localtime`: With no argument, returns the + current local time as an aware :class:`~datetime.datetime` using the local + :class:`~datetime.timezone`. Given an aware :class:`~datetime.datetime`, + converts it into an aware :class:`~datetime.datetime` using the + local :class:`~datetime.timezone`. ftplib diff --git a/Doc/whatsnew/3.7.rst b/Doc/whatsnew/3.7.rst index a3fd6321f3124e..48d712623e05dd 100644 --- a/Doc/whatsnew/3.7.rst +++ b/Doc/whatsnew/3.7.rst @@ -1580,13 +1580,13 @@ The initialization of the default warnings filters has changed as follows: * warnings filters enabled via the command line or the environment now have the following order of precedence: - * the ``BytesWarning`` filter for :option:`-b` (or ``-bb``) - * any filters specified with the :option:`-W` option - * any filters specified with the :envvar:`PYTHONWARNINGS` environment - variable - * any other CPython specific filters (e.g. the ``default`` filter added - for the new ``-X dev`` mode) - * any implicit filters defined directly by the warnings machinery + * the ``BytesWarning`` filter for :option:`-b` (or ``-bb``) + * any filters specified with the :option:`-W` option + * any filters specified with the :envvar:`PYTHONWARNINGS` environment + variable + * any other CPython specific filters (e.g. the ``default`` filter added + for the new ``-X dev`` mode) + * any implicit filters defined directly by the warnings machinery * in :ref:`CPython debug builds `, all warnings are now displayed by default (the implicit filter list is empty) From 79b81d1825cde3e051aa43d9e27edfbaeda84fda Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 11 Oct 2023 13:13:06 +0200 Subject: [PATCH 062/265] [3.11] gh-76106: Remove the cleanup lock in test_socket (GH-110539) (GH-110700) It does not already work (because it locks only addCleanup(), not doCleanups()), and it is no longer needed since the clean up procedure waits for all test threads to join. (cherry picked from commit f27b83090701b9c215e0d65f1f924fb9330cb649) Co-authored-by: Serhiy Storchaka --- Lib/test/test_socket.py | 26 ++------------------------ 1 file changed, 2 insertions(+), 24 deletions(-) diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index bc93d9988eaf03..3fceac340a461c 100644 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -201,26 +201,6 @@ def setUp(self): self.serv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDPLITE) self.port = socket_helper.bind_port(self.serv) -class ThreadSafeCleanupTestCase: - """Subclass of unittest.TestCase with thread-safe cleanup methods. - - This subclass protects the addCleanup() method with a recursive lock. 
- - doCleanups() is called when the server completed, but the client can still - be running in its thread especially if the server failed with a timeout. - Don't put a lock on doCleanups() to prevent deadlock between addCleanup() - called in the client and doCleanups() waiting for self.done.wait of - ThreadableTest._setUp() (gh-110167) - """ - - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - self._cleanup_lock = threading.RLock() - - def addCleanup(self, *args, **kwargs): - with self._cleanup_lock: - return super().addCleanup(*args, **kwargs) - class SocketCANTest(unittest.TestCase): @@ -613,8 +593,7 @@ def setUp(self): self.serv.listen() -class ThreadedSocketTestMixin(ThreadSafeCleanupTestCase, SocketTestBase, - ThreadableTest): +class ThreadedSocketTestMixin(SocketTestBase, ThreadableTest): """Mixin to add client socket and allow client/server tests. Client socket is self.cli and its address is self.cli_addr. See @@ -2698,7 +2677,7 @@ def _testRecvFromNegative(self): # here assumes that datagram delivery on the local machine will be # reliable. -class SendrecvmsgBase(ThreadSafeCleanupTestCase): +class SendrecvmsgBase: # Base class for sendmsg()/recvmsg() tests. # Time in seconds to wait before considering a test failed, or @@ -4564,7 +4543,6 @@ def testInterruptedRecvmsgIntoTimeout(self): @unittest.skipUnless(hasattr(signal, "alarm") or hasattr(signal, "setitimer"), "Don't have signal.alarm or signal.setitimer") class InterruptedSendTimeoutTest(InterruptedTimeoutBase, - ThreadSafeCleanupTestCase, SocketListeningTestMixin, TCPTestBase): # Test interrupting the interruptible send*() methods with signals # when a timeout is set. From ad504156372932922409b75d051a755008e40fb6 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 11 Oct 2023 21:42:24 +0200 Subject: [PATCH 063/265] [3.11] gh-99834: Update macOS installer to Tcl/Tk 8.6.13. (GH-110710) (cherry picked from commit 13e460086b007691f2ca1c5ff677cdb70d19eba8) Co-authored-by: Ned Deily --- Mac/BuildScript/build-installer.py | 6 +++--- .../macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst | 1 + 2 files changed, 4 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py index 553926852a7ca8..f3728e3259e514 100755 --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -264,10 +264,10 @@ def library_recipes(): tk_patches = ['tk868_on_10_8_10_9.patch'] else: - tcl_tk_ver='8.6.12' - tcl_checksum='87ea890821d2221f2ab5157bc5eb885f' + tcl_tk_ver='8.6.13' + tcl_checksum='43a1fae7412f61ff11de2cfd05d28cfc3a73762f354a417c62370a54e2caf066' - tk_checksum='1d6dcf6120356e3d211e056dff5e462a' + tk_checksum='2e65fa069a23365440a3c56c556b8673b5e32a283800d8d9b257e3f584ce0675' tk_patches = [ ] diff --git a/Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst b/Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst new file mode 100644 index 00000000000000..aeb64d25b09add --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst @@ -0,0 +1 @@ +Update macOS installer to Tcl/Tk 8.6.13. 
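
The recipe change above also replaces the 32-character hex digests recorded for
the Tcl/Tk 8.6.12 tarballs with 64-character (SHA-256-length) digests for
8.6.13. As a minimal, illustrative sketch (the helper name and the local
tarball filename are assumptions here, and this is not the verification code
used by ``build-installer.py`` itself), such a digest could be checked with the
standard ``hashlib`` module::

   import hashlib

   def sha256_of(path, chunk_size=1 << 20):
       """Return the hex SHA-256 digest of *path*, read in chunks."""
       digest = hashlib.sha256()
       with open(path, "rb") as f:
           for chunk in iter(lambda: f.read(chunk_size), b""):
               digest.update(chunk)
       return digest.hexdigest()

   if __name__ == "__main__":
       # Expected value taken from the tcl_checksum field in the recipe above.
       expected = "43a1fae7412f61ff11de2cfd05d28cfc3a73762f354a417c62370a54e2caf066"
       # Hypothetical local filename for the downloaded Tcl source tarball.
       actual = sha256_of("tcl8.6.13-src.tar.gz")
       print("checksum OK" if actual == expected else "checksum mismatch: " + actual)
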
From fd061a9bbe5b46c295ebe0d76b4411d0345efc7d Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Wed, 11 Oct 2023 22:48:03 +0200 Subject: [PATCH 064/265] [3.11] gh-108826: Document `dis` module CLI and rename `_test` function to `main` (#108827) (#110689) Co-authored-by: Hugo van Kemenade Co-authored-by: blurb-it[bot] <43283697+blurb-it[bot]@users.noreply.github.com> Co-authored-by: Radislav Chugunov <52372310+chgnrdv@users.noreply.github.com> --- Doc/library/asyncio.rst | 2 + Doc/library/cmdline.rst | 55 +++++++++++++++++++ Doc/library/compileall.rst | 2 + Doc/library/dis.rst | 26 +++++++++ Doc/library/gzip.rst | 2 + Doc/library/index.rst | 1 + Doc/library/pickletools.rst | 2 + Doc/library/profile.rst | 2 + Doc/library/py_compile.rst | 1 + Doc/library/sysconfig.rst | 1 + Lib/dis.py | 5 +- ...-09-03-13-43-49.gh-issue-108826.KG7abS.rst | 1 + 12 files changed, 97 insertions(+), 3 deletions(-) create mode 100644 Doc/library/cmdline.rst create mode 100644 Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst index c6a046f534e9a1..c75ab47404c1e4 100644 --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -56,6 +56,8 @@ Additionally, there are **low-level** APIs for * :ref:`bridge ` callback-based libraries and code with async/await syntax. +.. _asyncio-cli: + You can experiment with an ``asyncio`` concurrent context in the REPL: .. code-block:: pycon diff --git a/Doc/library/cmdline.rst b/Doc/library/cmdline.rst new file mode 100644 index 00000000000000..af137b8ec34c09 --- /dev/null +++ b/Doc/library/cmdline.rst @@ -0,0 +1,55 @@ +++++++++++++++++++++++++++++++++++++ +Modules command-line interface (CLI) +++++++++++++++++++++++++++++++++++++ + +The following modules have a command-line interface. + +* :ref:`ast ` +* :ref:`asyncio ` +* :mod:`base64` +* :ref:`calendar ` +* :mod:`code` +* :ref:`compileall ` +* :mod:`cProfile`: see :ref:`profile ` +* :ref:`difflib ` +* :ref:`dis ` +* :mod:`doctest` +* :mod:`!encodings.rot_13` +* :mod:`ensurepip` +* :mod:`filecmp` +* :mod:`fileinput` +* :mod:`ftplib` +* :ref:`gzip ` +* :ref:`http.server ` +* :mod:`!idlelib` +* :ref:`inspect ` +* :ref:`json.tool ` +* :mod:`mimetypes` +* :mod:`pdb` +* :mod:`pickle` +* :ref:`pickletools ` +* :mod:`platform` +* :mod:`poplib` +* :ref:`profile ` +* :mod:`pstats` +* :ref:`py_compile ` +* :mod:`pyclbr` +* :mod:`pydoc` +* :mod:`quopri` +* :mod:`runpy` +* :ref:`site ` +* :ref:`sysconfig ` +* :mod:`tabnanny` +* :ref:`tarfile ` +* :mod:`!this` +* :ref:`timeit ` +* :ref:`tokenize ` +* :ref:`trace ` +* :mod:`turtledemo` +* :ref:`unittest ` +* :mod:`venv` +* :mod:`webbrowser` +* :ref:`zipapp ` +* :ref:`zipfile ` + +See also the :ref:`Python command-line interface `. diff --git a/Doc/library/compileall.rst b/Doc/library/compileall.rst index a1482c9eb889e8..6d16734ddca21e 100644 --- a/Doc/library/compileall.rst +++ b/Doc/library/compileall.rst @@ -16,6 +16,8 @@ have write permission to the library directories. .. include:: ../includes/wasm-notavail.rst +.. _compileall-cli: + Command-line use ---------------- diff --git a/Doc/library/dis.rst b/Doc/library/dis.rst index f56784e769a3ee..a12f5937cd379e 100644 --- a/Doc/library/dis.rst +++ b/Doc/library/dis.rst @@ -64,6 +64,32 @@ the following command can be used to display the disassembly of (The "2" is a line number). +.. _dis-cli: + +Command-line interface +---------------------- + +The :mod:`dis` module can be invoked as a script from the command line: + +.. 
code-block:: sh + + python -m dis [-h] [-C] [infile] + +The following options are accepted: + +.. program:: dis + +.. cmdoption:: -h, --help + + Display usage and exit. + +.. cmdoption:: -C, --show-caches + + Show inline caches. + +If :file:`infile` is specified, its disassembled code will be written to stdout. +Otherwise, disassembly is performed on compiled source code recieved from stdin. + Bytecode analysis ----------------- diff --git a/Doc/library/gzip.rst b/Doc/library/gzip.rst index 4d2fa3d4012a1d..fdb546236aeadc 100644 --- a/Doc/library/gzip.rst +++ b/Doc/library/gzip.rst @@ -246,6 +246,8 @@ Example of how to GZIP compress a binary string:: .. program:: gzip +.. _gzip-cli: + Command Line Interface ---------------------- diff --git a/Doc/library/index.rst b/Doc/library/index.rst index d064b680f9aaa4..0b348ae6f5c8c0 100644 --- a/Doc/library/index.rst +++ b/Doc/library/index.rst @@ -73,5 +73,6 @@ the `Python Package Index `_. language.rst windows.rst unix.rst + cmdline.rst superseded.rst security_warnings.rst diff --git a/Doc/library/pickletools.rst b/Doc/library/pickletools.rst index c6ff4e6ed3d3b5..41930f8cbe8412 100644 --- a/Doc/library/pickletools.rst +++ b/Doc/library/pickletools.rst @@ -17,6 +17,8 @@ are useful for Python core developers who are working on the :mod:`pickle`; ordinary users of the :mod:`pickle` module probably won't find the :mod:`pickletools` module relevant. +.. _pickletools-cli: + Command line usage ------------------ diff --git a/Doc/library/profile.rst b/Doc/library/profile.rst index 723f927135a0f4..69274b0c354a25 100644 --- a/Doc/library/profile.rst +++ b/Doc/library/profile.rst @@ -121,6 +121,8 @@ results to a file by specifying a filename to the :func:`run` function:: The :class:`pstats.Stats` class reads profile results from a file and formats them in various ways. +.. _profile-cli: + The files :mod:`cProfile` and :mod:`profile` can also be invoked as a script to profile another script. For example:: diff --git a/Doc/library/py_compile.rst b/Doc/library/py_compile.rst index 7272f36d9a5568..38c416f9ad0305 100644 --- a/Doc/library/py_compile.rst +++ b/Doc/library/py_compile.rst @@ -125,6 +125,7 @@ byte-code cache files in the directory containing the source code. This option is useful when the ``.pycs`` are kept up to date by some system external to Python like a build system. +.. _py_compile-cli: Command-Line Interface ---------------------- diff --git a/Doc/library/sysconfig.rst b/Doc/library/sysconfig.rst index 17448f2a2abace..2de47cfd36a0f7 100644 --- a/Doc/library/sysconfig.rst +++ b/Doc/library/sysconfig.rst @@ -278,6 +278,7 @@ Other functions Return the path of :file:`Makefile`. +.. 
_sysconfig-cli: Using :mod:`sysconfig` as a script ---------------------------------- diff --git a/Lib/dis.py b/Lib/dis.py index 5bf52c3e6d3ac3..196c886f785082 100644 --- a/Lib/dis.py +++ b/Lib/dis.py @@ -759,8 +759,7 @@ def dis(self): return output.getvalue() -def _test(): - """Simple test program to disassemble a file.""" +def main(): import argparse parser = argparse.ArgumentParser() @@ -772,4 +771,4 @@ def _test(): dis(code) if __name__ == "__main__": - _test() + main() diff --git a/Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst b/Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst new file mode 100644 index 00000000000000..139b8f3457930c --- /dev/null +++ b/Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst @@ -0,0 +1 @@ +:mod:`dis` module command-line interface is now mentioned in documentation. From 07471cda29fbd5cba90f5f9a8b221e43d8247bf0 Mon Sep 17 00:00:00 2001 From: Ezio Melotti Date: Wed, 11 Oct 2023 23:11:41 +0200 Subject: [PATCH 065/265] [3.11] gh-110631: Fix reST indentation in `Doc/library` (GH-110685) (#110737) * [3.11] gh-110631: Fix reST indentation in `Doc/library` (GH-110685) Fix wrong indentation in the Doc/library dir.. (cherry picked from commit bb7923f556537a463c403dc1097726d8a8e1a6f2) Co-authored-by: Ezio Melotti * Fix merge glitch. --- Doc/library/__main__.rst | 46 +- Doc/library/_thread.rst | 2 +- Doc/library/binascii.rst | 9 +- Doc/library/collections.rst | 78 +-- Doc/library/concurrent.futures.rst | 270 +++++----- Doc/library/ctypes.rst | 84 +-- Doc/library/curses.rst | 6 +- Doc/library/dialog.rst | 12 +- Doc/library/email.contentmanager.rst | 50 +- Doc/library/email.policy.rst | 18 +- Doc/library/enum.rst | 28 +- Doc/library/functions.rst | 30 +- Doc/library/graphlib.rst | 16 +- Doc/library/idle.rst | 22 +- Doc/library/inspect.rst | 18 +- Doc/library/io.rst | 10 +- Doc/library/logging.config.rst | 14 +- Doc/library/lzma.rst | 51 +- Doc/library/multiprocessing.rst | 28 +- Doc/library/numbers.rst | 34 +- Doc/library/profile.rst | 8 +- Doc/library/re.rst | 2 +- Doc/library/shelve.rst | 6 +- Doc/library/socket.rst | 4 +- Doc/library/sqlite3.rst | 24 +- Doc/library/ssl.rst | 24 +- Doc/library/stdtypes.rst | 6 +- Doc/library/string.rst | 100 ++-- Doc/library/subprocess.rst | 28 +- Doc/library/tkinter.rst | 36 +- Doc/library/tkinter.ttk.rst | 748 +++++++++++++-------------- Doc/library/unittest.mock.rst | 10 +- Doc/library/urllib.request.rst | 12 +- 33 files changed, 920 insertions(+), 914 deletions(-) diff --git a/Doc/library/__main__.rst b/Doc/library/__main__.rst index c35f67245c97ec..435310d525d754 100644 --- a/Doc/library/__main__.rst +++ b/Doc/library/__main__.rst @@ -54,45 +54,45 @@ The top-level code environment can be: * the scope of an interactive prompt:: - >>> __name__ - '__main__' + >>> __name__ + '__main__' * the Python module passed to the Python interpreter as a file argument: - .. code-block:: shell-session + .. code-block:: shell-session - $ python3 helloworld.py - Hello, world! + $ python3 helloworld.py + Hello, world! * the Python module or package passed to the Python interpreter with the :option:`-m` argument: - .. code-block:: shell-session + .. code-block:: shell-session - $ python3 -m tarfile - usage: tarfile.py [-h] [-v] (...) + $ python3 -m tarfile + usage: tarfile.py [-h] [-v] (...) * Python code read by the Python interpreter from standard input: - .. code-block:: shell-session + .. 
code-block:: shell-session - $ echo "import this" | python3 - The Zen of Python, by Tim Peters + $ echo "import this" | python3 + The Zen of Python, by Tim Peters - Beautiful is better than ugly. - Explicit is better than implicit. - ... + Beautiful is better than ugly. + Explicit is better than implicit. + ... * Python code passed to the Python interpreter with the :option:`-c` argument: - .. code-block:: shell-session + .. code-block:: shell-session - $ python3 -c "import this" - The Zen of Python, by Tim Peters + $ python3 -c "import this" + The Zen of Python, by Tim Peters - Beautiful is better than ugly. - Explicit is better than implicit. - ... + Beautiful is better than ugly. + Explicit is better than implicit. + ... In each of these situations, the top-level module's ``__name__`` is set to ``'__main__'``. @@ -102,9 +102,9 @@ top-level environment by checking its own ``__name__``, which allows a common idiom for conditionally executing code when the module is not initialized from an import statement:: - if __name__ == '__main__': - # Execute when the module is not initialized from an import statement. - ... + if __name__ == '__main__': + # Execute when the module is not initialized from an import statement. + ... .. seealso:: diff --git a/Doc/library/_thread.rst b/Doc/library/_thread.rst index 4227a4fe1e33a2..5e506d2c37d6e0 100644 --- a/Doc/library/_thread.rst +++ b/Doc/library/_thread.rst @@ -206,7 +206,7 @@ In addition to these methods, lock objects can also be used via the **Caveats:** - .. index:: pair: module; signal +.. index:: pair: module; signal * Threads interact strangely with interrupts: the :exc:`KeyboardInterrupt` exception will be received by an arbitrary thread. (When the :mod:`signal` diff --git a/Doc/library/binascii.rst b/Doc/library/binascii.rst index 21960cb7972e6e..d065fa3506da11 100644 --- a/Doc/library/binascii.rst +++ b/Doc/library/binascii.rst @@ -58,10 +58,11 @@ The :mod:`binascii` module defines the following functions: data will raise :exc:`binascii.Error`. Valid base64: - * Conforms to :rfc:`3548`. - * Contains only characters from the base64 alphabet. - * Contains no excess data after padding (including excess padding, newlines, etc.). - * Does not start with a padding. + + * Conforms to :rfc:`3548`. + * Contains only characters from the base64 alphabet. + * Contains no excess data after padding (including excess padding, newlines, etc.). + * Does not start with a padding. .. versionchanged:: 3.11 Added the *strict_mode* parameter. diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst index 285d362104119a..5ea3a5dd6254c7 100644 --- a/Doc/library/collections.rst +++ b/Doc/library/collections.rst @@ -120,26 +120,26 @@ The class can be used to simulate nested scopes and is useful in templating. .. seealso:: - * The `MultiContext class - `_ - in the Enthought `CodeTools package - `_ has options to support - writing to any mapping in the chain. + * The `MultiContext class + `_ + in the Enthought `CodeTools package + `_ has options to support + writing to any mapping in the chain. - * Django's `Context class - `_ - for templating is a read-only chain of mappings. It also features - pushing and popping of contexts similar to the - :meth:`~collections.ChainMap.new_child` method and the - :attr:`~collections.ChainMap.parents` property. + * Django's `Context class + `_ + for templating is a read-only chain of mappings. 
It also features + pushing and popping of contexts similar to the + :meth:`~collections.ChainMap.new_child` method and the + :attr:`~collections.ChainMap.parents` property. - * The `Nested Contexts recipe - `_ has options to control - whether writes and other mutations apply only to the first mapping or to - any mapping in the chain. + * The `Nested Contexts recipe + `_ has options to control + whether writes and other mutations apply only to the first mapping or to + any mapping in the chain. - * A `greatly simplified read-only version of Chainmap - `_. + * A `greatly simplified read-only version of Chainmap + `_. :class:`ChainMap` Examples and Recipes @@ -428,22 +428,22 @@ or subtracting from an empty counter. .. seealso:: - * `Bag class `_ - in Smalltalk. + * `Bag class `_ + in Smalltalk. - * Wikipedia entry for `Multisets `_. + * Wikipedia entry for `Multisets `_. - * `C++ multisets `_ - tutorial with examples. + * `C++ multisets `_ + tutorial with examples. - * For mathematical operations on multisets and their use cases, see - *Knuth, Donald. The Art of Computer Programming Volume II, - Section 4.6.3, Exercise 19*. + * For mathematical operations on multisets and their use cases, see + *Knuth, Donald. The Art of Computer Programming Volume II, + Section 4.6.3, Exercise 19*. - * To enumerate all distinct multisets of a given size over a given set of - elements, see :func:`itertools.combinations_with_replacement`:: + * To enumerate all distinct multisets of a given size over a given set of + elements, see :func:`itertools.combinations_with_replacement`:: - map(Counter, combinations_with_replacement('ABC', 2)) # --> AA AB AC BB BC CC + map(Counter, combinations_with_replacement('ABC', 2)) # --> AA AB AC BB BC CC :class:`deque` objects @@ -1058,20 +1058,20 @@ fields: .. seealso:: - * See :class:`typing.NamedTuple` for a way to add type hints for named - tuples. It also provides an elegant notation using the :keyword:`class` - keyword:: + * See :class:`typing.NamedTuple` for a way to add type hints for named + tuples. It also provides an elegant notation using the :keyword:`class` + keyword:: - class Component(NamedTuple): - part_number: int - weight: float - description: Optional[str] = None + class Component(NamedTuple): + part_number: int + weight: float + description: Optional[str] = None - * See :meth:`types.SimpleNamespace` for a mutable namespace based on an - underlying dictionary instead of a tuple. + * See :meth:`types.SimpleNamespace` for a mutable namespace based on an + underlying dictionary instead of a tuple. - * The :mod:`dataclasses` module provides a decorator and functions for - automatically adding generated special methods to user-defined classes. + * The :mod:`dataclasses` module provides a decorator and functions for + automatically adding generated special methods to user-defined classes. :class:`OrderedDict` objects diff --git a/Doc/library/concurrent.futures.rst b/Doc/library/concurrent.futures.rst index c2c64d9c6ee769..8ab2c3a2be2d65 100644 --- a/Doc/library/concurrent.futures.rst +++ b/Doc/library/concurrent.futures.rst @@ -29,83 +29,83 @@ Executor Objects An abstract class that provides methods to execute calls asynchronously. It should not be used directly, but through its concrete subclasses. - .. method:: submit(fn, /, *args, **kwargs) + .. method:: submit(fn, /, *args, **kwargs) - Schedules the callable, *fn*, to be executed as ``fn(*args, **kwargs)`` - and returns a :class:`Future` object representing the execution of the - callable. 
:: + Schedules the callable, *fn*, to be executed as ``fn(*args, **kwargs)`` + and returns a :class:`Future` object representing the execution of the + callable. :: - with ThreadPoolExecutor(max_workers=1) as executor: - future = executor.submit(pow, 323, 1235) - print(future.result()) + with ThreadPoolExecutor(max_workers=1) as executor: + future = executor.submit(pow, 323, 1235) + print(future.result()) - .. method:: map(func, *iterables, timeout=None, chunksize=1) + .. method:: map(func, *iterables, timeout=None, chunksize=1) - Similar to :func:`map(func, *iterables) ` except: + Similar to :func:`map(func, *iterables) ` except: - * the *iterables* are collected immediately rather than lazily; + * the *iterables* are collected immediately rather than lazily; - * *func* is executed asynchronously and several calls to - *func* may be made concurrently. + * *func* is executed asynchronously and several calls to + *func* may be made concurrently. - The returned iterator raises a :exc:`TimeoutError` - if :meth:`~iterator.__next__` is called and the result isn't available - after *timeout* seconds from the original call to :meth:`Executor.map`. - *timeout* can be an int or a float. If *timeout* is not specified or - ``None``, there is no limit to the wait time. + The returned iterator raises a :exc:`TimeoutError` + if :meth:`~iterator.__next__` is called and the result isn't available + after *timeout* seconds from the original call to :meth:`Executor.map`. + *timeout* can be an int or a float. If *timeout* is not specified or + ``None``, there is no limit to the wait time. - If a *func* call raises an exception, then that exception will be - raised when its value is retrieved from the iterator. + If a *func* call raises an exception, then that exception will be + raised when its value is retrieved from the iterator. - When using :class:`ProcessPoolExecutor`, this method chops *iterables* - into a number of chunks which it submits to the pool as separate - tasks. The (approximate) size of these chunks can be specified by - setting *chunksize* to a positive integer. For very long iterables, - using a large value for *chunksize* can significantly improve - performance compared to the default size of 1. With - :class:`ThreadPoolExecutor`, *chunksize* has no effect. + When using :class:`ProcessPoolExecutor`, this method chops *iterables* + into a number of chunks which it submits to the pool as separate + tasks. The (approximate) size of these chunks can be specified by + setting *chunksize* to a positive integer. For very long iterables, + using a large value for *chunksize* can significantly improve + performance compared to the default size of 1. With + :class:`ThreadPoolExecutor`, *chunksize* has no effect. - .. versionchanged:: 3.5 - Added the *chunksize* argument. + .. versionchanged:: 3.5 + Added the *chunksize* argument. - .. method:: shutdown(wait=True, *, cancel_futures=False) + .. method:: shutdown(wait=True, *, cancel_futures=False) - Signal the executor that it should free any resources that it is using - when the currently pending futures are done executing. Calls to - :meth:`Executor.submit` and :meth:`Executor.map` made after shutdown will - raise :exc:`RuntimeError`. + Signal the executor that it should free any resources that it is using + when the currently pending futures are done executing. Calls to + :meth:`Executor.submit` and :meth:`Executor.map` made after shutdown will + raise :exc:`RuntimeError`. 
- If *wait* is ``True`` then this method will not return until all the - pending futures are done executing and the resources associated with the - executor have been freed. If *wait* is ``False`` then this method will - return immediately and the resources associated with the executor will be - freed when all pending futures are done executing. Regardless of the - value of *wait*, the entire Python program will not exit until all - pending futures are done executing. + If *wait* is ``True`` then this method will not return until all the + pending futures are done executing and the resources associated with the + executor have been freed. If *wait* is ``False`` then this method will + return immediately and the resources associated with the executor will be + freed when all pending futures are done executing. Regardless of the + value of *wait*, the entire Python program will not exit until all + pending futures are done executing. - If *cancel_futures* is ``True``, this method will cancel all pending - futures that the executor has not started running. Any futures that - are completed or running won't be cancelled, regardless of the value - of *cancel_futures*. + If *cancel_futures* is ``True``, this method will cancel all pending + futures that the executor has not started running. Any futures that + are completed or running won't be cancelled, regardless of the value + of *cancel_futures*. - If both *cancel_futures* and *wait* are ``True``, all futures that the - executor has started running will be completed prior to this method - returning. The remaining futures are cancelled. + If both *cancel_futures* and *wait* are ``True``, all futures that the + executor has started running will be completed prior to this method + returning. The remaining futures are cancelled. - You can avoid having to call this method explicitly if you use the - :keyword:`with` statement, which will shutdown the :class:`Executor` - (waiting as if :meth:`Executor.shutdown` were called with *wait* set to - ``True``):: + You can avoid having to call this method explicitly if you use the + :keyword:`with` statement, which will shutdown the :class:`Executor` + (waiting as if :meth:`Executor.shutdown` were called with *wait* set to + ``True``):: - import shutil - with ThreadPoolExecutor(max_workers=4) as e: - e.submit(shutil.copy, 'src1.txt', 'dest1.txt') - e.submit(shutil.copy, 'src2.txt', 'dest2.txt') - e.submit(shutil.copy, 'src3.txt', 'dest3.txt') - e.submit(shutil.copy, 'src4.txt', 'dest4.txt') + import shutil + with ThreadPoolExecutor(max_workers=4) as e: + e.submit(shutil.copy, 'src1.txt', 'dest1.txt') + e.submit(shutil.copy, 'src2.txt', 'dest2.txt') + e.submit(shutil.copy, 'src3.txt', 'dest3.txt') + e.submit(shutil.copy, 'src4.txt', 'dest4.txt') - .. versionchanged:: 3.9 - Added *cancel_futures*. + .. versionchanged:: 3.9 + Added *cancel_futures*. ThreadPoolExecutor @@ -337,117 +337,117 @@ The :class:`Future` class encapsulates the asynchronous execution of a callable. instances are created by :meth:`Executor.submit` and should not be created directly except for testing. - .. method:: cancel() + .. method:: cancel() - Attempt to cancel the call. If the call is currently being executed or - finished running and cannot be cancelled then the method will return - ``False``, otherwise the call will be cancelled and the method will - return ``True``. + Attempt to cancel the call. 
If the call is currently being executed or + finished running and cannot be cancelled then the method will return + ``False``, otherwise the call will be cancelled and the method will + return ``True``. - .. method:: cancelled() + .. method:: cancelled() - Return ``True`` if the call was successfully cancelled. + Return ``True`` if the call was successfully cancelled. - .. method:: running() + .. method:: running() - Return ``True`` if the call is currently being executed and cannot be - cancelled. + Return ``True`` if the call is currently being executed and cannot be + cancelled. - .. method:: done() + .. method:: done() - Return ``True`` if the call was successfully cancelled or finished - running. + Return ``True`` if the call was successfully cancelled or finished + running. - .. method:: result(timeout=None) + .. method:: result(timeout=None) - Return the value returned by the call. If the call hasn't yet completed - then this method will wait up to *timeout* seconds. If the call hasn't - completed in *timeout* seconds, then a - :exc:`TimeoutError` will be raised. *timeout* can be - an int or float. If *timeout* is not specified or ``None``, there is no - limit to the wait time. + Return the value returned by the call. If the call hasn't yet completed + then this method will wait up to *timeout* seconds. If the call hasn't + completed in *timeout* seconds, then a + :exc:`TimeoutError` will be raised. *timeout* can be + an int or float. If *timeout* is not specified or ``None``, there is no + limit to the wait time. - If the future is cancelled before completing then :exc:`.CancelledError` - will be raised. + If the future is cancelled before completing then :exc:`.CancelledError` + will be raised. - If the call raised an exception, this method will raise the same exception. + If the call raised an exception, this method will raise the same exception. - .. method:: exception(timeout=None) + .. method:: exception(timeout=None) - Return the exception raised by the call. If the call hasn't yet - completed then this method will wait up to *timeout* seconds. If the - call hasn't completed in *timeout* seconds, then a - :exc:`TimeoutError` will be raised. *timeout* can be - an int or float. If *timeout* is not specified or ``None``, there is no - limit to the wait time. + Return the exception raised by the call. If the call hasn't yet + completed then this method will wait up to *timeout* seconds. If the + call hasn't completed in *timeout* seconds, then a + :exc:`TimeoutError` will be raised. *timeout* can be + an int or float. If *timeout* is not specified or ``None``, there is no + limit to the wait time. - If the future is cancelled before completing then :exc:`.CancelledError` - will be raised. + If the future is cancelled before completing then :exc:`.CancelledError` + will be raised. - If the call completed without raising, ``None`` is returned. + If the call completed without raising, ``None`` is returned. - .. method:: add_done_callback(fn) + .. method:: add_done_callback(fn) - Attaches the callable *fn* to the future. *fn* will be called, with the - future as its only argument, when the future is cancelled or finishes - running. + Attaches the callable *fn* to the future. *fn* will be called, with the + future as its only argument, when the future is cancelled or finishes + running. - Added callables are called in the order that they were added and are - always called in a thread belonging to the process that added them. 
If - the callable raises an :exc:`Exception` subclass, it will be logged and - ignored. If the callable raises a :exc:`BaseException` subclass, the - behavior is undefined. + Added callables are called in the order that they were added and are + always called in a thread belonging to the process that added them. If + the callable raises an :exc:`Exception` subclass, it will be logged and + ignored. If the callable raises a :exc:`BaseException` subclass, the + behavior is undefined. - If the future has already completed or been cancelled, *fn* will be - called immediately. + If the future has already completed or been cancelled, *fn* will be + called immediately. The following :class:`Future` methods are meant for use in unit tests and :class:`Executor` implementations. - .. method:: set_running_or_notify_cancel() + .. method:: set_running_or_notify_cancel() - This method should only be called by :class:`Executor` implementations - before executing the work associated with the :class:`Future` and by unit - tests. + This method should only be called by :class:`Executor` implementations + before executing the work associated with the :class:`Future` and by unit + tests. - If the method returns ``False`` then the :class:`Future` was cancelled, - i.e. :meth:`Future.cancel` was called and returned ``True``. Any threads - waiting on the :class:`Future` completing (i.e. through - :func:`as_completed` or :func:`wait`) will be woken up. + If the method returns ``False`` then the :class:`Future` was cancelled, + i.e. :meth:`Future.cancel` was called and returned ``True``. Any threads + waiting on the :class:`Future` completing (i.e. through + :func:`as_completed` or :func:`wait`) will be woken up. - If the method returns ``True`` then the :class:`Future` was not cancelled - and has been put in the running state, i.e. calls to - :meth:`Future.running` will return ``True``. + If the method returns ``True`` then the :class:`Future` was not cancelled + and has been put in the running state, i.e. calls to + :meth:`Future.running` will return ``True``. - This method can only be called once and cannot be called after - :meth:`Future.set_result` or :meth:`Future.set_exception` have been - called. + This method can only be called once and cannot be called after + :meth:`Future.set_result` or :meth:`Future.set_exception` have been + called. - .. method:: set_result(result) + .. method:: set_result(result) - Sets the result of the work associated with the :class:`Future` to - *result*. + Sets the result of the work associated with the :class:`Future` to + *result*. - This method should only be used by :class:`Executor` implementations and - unit tests. + This method should only be used by :class:`Executor` implementations and + unit tests. - .. versionchanged:: 3.8 - This method raises - :exc:`concurrent.futures.InvalidStateError` if the :class:`Future` is - already done. + .. versionchanged:: 3.8 + This method raises + :exc:`concurrent.futures.InvalidStateError` if the :class:`Future` is + already done. - .. method:: set_exception(exception) + .. method:: set_exception(exception) - Sets the result of the work associated with the :class:`Future` to the - :class:`Exception` *exception*. + Sets the result of the work associated with the :class:`Future` to the + :class:`Exception` *exception*. - This method should only be used by :class:`Executor` implementations and - unit tests. + This method should only be used by :class:`Executor` implementations and + unit tests. - .. 
versionchanged:: 3.8 - This method raises - :exc:`concurrent.futures.InvalidStateError` if the :class:`Future` is - already done. + .. versionchanged:: 3.8 + This method raises + :exc:`concurrent.futures.InvalidStateError` if the :class:`Future` is + already done. Module Functions ---------------- diff --git a/Doc/library/ctypes.rst b/Doc/library/ctypes.rst index 51cffc029017b2..0088775d2fd2a7 100644 --- a/Doc/library/ctypes.rst +++ b/Doc/library/ctypes.rst @@ -1697,70 +1697,70 @@ See :ref:`ctypes-callback-functions` for examples. Function prototypes created by these factory functions can be instantiated in different ways, depending on the type and number of the parameters in the call: +.. function:: prototype(address) + :noindex: + :module: - .. function:: prototype(address) - :noindex: - :module: + Returns a foreign function at the specified address which must be an integer. - Returns a foreign function at the specified address which must be an integer. +.. function:: prototype(callable) + :noindex: + :module: - .. function:: prototype(callable) - :noindex: - :module: + Create a C callable function (a callback function) from a Python *callable*. - Create a C callable function (a callback function) from a Python *callable*. +.. function:: prototype(func_spec[, paramflags]) + :noindex: + :module: - .. function:: prototype(func_spec[, paramflags]) - :noindex: - :module: + Returns a foreign function exported by a shared library. *func_spec* must + be a 2-tuple ``(name_or_ordinal, library)``. The first item is the name of + the exported function as string, or the ordinal of the exported function + as small integer. The second item is the shared library instance. - Returns a foreign function exported by a shared library. *func_spec* must - be a 2-tuple ``(name_or_ordinal, library)``. The first item is the name of - the exported function as string, or the ordinal of the exported function - as small integer. The second item is the shared library instance. +.. function:: prototype(vtbl_index, name[, paramflags[, iid]]) + :noindex: + :module: - .. function:: prototype(vtbl_index, name[, paramflags[, iid]]) - :noindex: - :module: + Returns a foreign function that will call a COM method. *vtbl_index* is + the index into the virtual function table, a small non-negative + integer. *name* is name of the COM method. *iid* is an optional pointer to + the interface identifier which is used in extended error reporting. - Returns a foreign function that will call a COM method. *vtbl_index* is - the index into the virtual function table, a small non-negative - integer. *name* is name of the COM method. *iid* is an optional pointer to - the interface identifier which is used in extended error reporting. + COM methods use a special calling convention: They require a pointer to + the COM interface as first argument, in addition to those parameters that + are specified in the :attr:`!argtypes` tuple. - COM methods use a special calling convention: They require a pointer to - the COM interface as first argument, in addition to those parameters that - are specified in the :attr:`argtypes` tuple. +The optional *paramflags* parameter creates foreign function wrappers with much +more functionality than the features described above. - The optional *paramflags* parameter creates foreign function wrappers with much - more functionality than the features described above. +*paramflags* must be a tuple of the same length as :attr:`~_FuncPtr.argtypes`. - *paramflags* must be a tuple of the same length as :attr:`argtypes`. 
+Each item in this tuple contains further information about a parameter, it must +be a tuple containing one, two, or three items. - Each item in this tuple contains further information about a parameter, it must - be a tuple containing one, two, or three items. +The first item is an integer containing a combination of direction +flags for the parameter: - The first item is an integer containing a combination of direction - flags for the parameter: + 1 + Specifies an input parameter to the function. - 1 - Specifies an input parameter to the function. + 2 + Output parameter. The foreign function fills in a value. - 2 - Output parameter. The foreign function fills in a value. + 4 + Input parameter which defaults to the integer zero. - 4 - Input parameter which defaults to the integer zero. +The optional second item is the parameter name as string. If this is specified, +the foreign function can be called with named parameters. - The optional second item is the parameter name as string. If this is specified, - the foreign function can be called with named parameters. +The optional third item is the default value for this parameter. - The optional third item is the default value for this parameter. -This example demonstrates how to wrap the Windows ``MessageBoxW`` function so +The following example demonstrates how to wrap the Windows ``MessageBoxW`` function so that it supports default parameters and named arguments. The C declaration from the windows header file is this:: diff --git a/Doc/library/curses.rst b/Doc/library/curses.rst index a17d528793eff9..0d743efd28efff 100644 --- a/Doc/library/curses.rst +++ b/Doc/library/curses.rst @@ -1774,9 +1774,9 @@ The following table lists mouse button constants used by :meth:`getmouse`: | .. data:: BUTTON_ALT | Control was down during button state change | +----------------------------------+---------------------------------------------+ - .. versionchanged:: 3.10 - The ``BUTTON5_*`` constants are now exposed if they are provided by the - underlying curses library. +.. versionchanged:: 3.10 + The ``BUTTON5_*`` constants are now exposed if they are provided by the + underlying curses library. The following table lists the predefined colors: diff --git a/Doc/library/dialog.rst b/Doc/library/dialog.rst index 53f98c1018988f..191e0da12103fa 100644 --- a/Doc/library/dialog.rst +++ b/Doc/library/dialog.rst @@ -27,15 +27,15 @@ functions for creating simple modal dialogs to get a value from the user. The base class for custom dialogs. - .. method:: body(master) + .. method:: body(master) - Override to construct the dialog's interface and return the widget that - should have initial focus. + Override to construct the dialog's interface and return the widget that + should have initial focus. - .. method:: buttonbox() + .. method:: buttonbox() - Default behaviour adds OK and Cancel buttons. Override for custom button - layouts. + Default behaviour adds OK and Cancel buttons. Override for custom button + layouts. 
diff --git a/Doc/library/email.contentmanager.rst b/Doc/library/email.contentmanager.rst index 918fc55677e723..5b49339650f0e9 100644 --- a/Doc/library/email.contentmanager.rst +++ b/Doc/library/email.contentmanager.rst @@ -32,9 +32,9 @@ To find the handler, look for the following keys in the registry, stopping with the first one found: - * the string representing the full MIME type (``maintype/subtype``) - * the string representing the ``maintype`` - * the empty string + * the string representing the full MIME type (``maintype/subtype``) + * the string representing the ``maintype`` + * the empty string If none of these keys produce a handler, raise a :exc:`KeyError` for the full MIME type. @@ -55,11 +55,11 @@ look for the following keys in the registry, stopping with the first one found: - * the type itself (``typ``) - * the type's fully qualified name (``typ.__module__ + '.' + - typ.__qualname__``). - * the type's qualname (``typ.__qualname__``) - * the type's name (``typ.__name__``). + * the type itself (``typ``) + * the type's fully qualified name (``typ.__module__ + '.' + + typ.__qualname__``). + * the type's qualname (``typ.__qualname__``) + * the type's name (``typ.__name__``). If none of the above match, repeat all of the checks above for each of the types in the :term:`MRO` (``typ.__mro__``). Finally, if no other key @@ -132,15 +132,15 @@ Currently the email package provides only one concrete content manager, Add a :mailheader:`Content-Type` header with a ``maintype/subtype`` value. - * For ``str``, set the MIME ``maintype`` to ``text``, and set the - subtype to *subtype* if it is specified, or ``plain`` if it is not. - * For ``bytes``, use the specified *maintype* and *subtype*, or - raise a :exc:`TypeError` if they are not specified. - * For :class:`~email.message.EmailMessage` objects, set the maintype - to ``message``, and set the subtype to *subtype* if it is - specified or ``rfc822`` if it is not. If *subtype* is - ``partial``, raise an error (``bytes`` objects must be used to - construct ``message/partial`` parts). + * For ``str``, set the MIME ``maintype`` to ``text``, and set the + subtype to *subtype* if it is specified, or ``plain`` if it is not. + * For ``bytes``, use the specified *maintype* and *subtype*, or + raise a :exc:`TypeError` if they are not specified. + * For :class:`~email.message.EmailMessage` objects, set the maintype + to ``message``, and set the subtype to *subtype* if it is + specified or ``rfc822`` if it is not. If *subtype* is + ``partial``, raise an error (``bytes`` objects must be used to + construct ``message/partial`` parts). If *charset* is provided (which is valid only for ``str``), encode the string to bytes using the specified character set. The default is @@ -155,14 +155,14 @@ Currently the email package provides only one concrete content manager, ``7bit`` for an input that contains non-ASCII values), raise a :exc:`ValueError`. - * For ``str`` objects, if *cte* is not set use heuristics to - determine the most compact encoding. - * For :class:`~email.message.EmailMessage`, per :rfc:`2046`, raise - an error if a *cte* of ``quoted-printable`` or ``base64`` is - requested for *subtype* ``rfc822``, and for any *cte* other than - ``7bit`` for *subtype* ``external-body``. For - ``message/rfc822``, use ``8bit`` if *cte* is not specified. For - all other values of *subtype*, use ``7bit``. + * For ``str`` objects, if *cte* is not set use heuristics to + determine the most compact encoding. 
+ * For :class:`~email.message.EmailMessage`, per :rfc:`2046`, raise + an error if a *cte* of ``quoted-printable`` or ``base64`` is + requested for *subtype* ``rfc822``, and for any *cte* other than + ``7bit`` for *subtype* ``external-body``. For + ``message/rfc822``, use ``8bit`` if *cte* is not specified. For + all other values of *subtype*, use ``7bit``. .. note:: A *cte* of ``binary`` does not actually work correctly yet. The ``EmailMessage`` object as modified by ``set_content`` is diff --git a/Doc/library/email.policy.rst b/Doc/library/email.policy.rst index bf53b9520fc723..9b462fab119561 100644 --- a/Doc/library/email.policy.rst +++ b/Doc/library/email.policy.rst @@ -556,17 +556,17 @@ more closely to the RFCs relevant to their domains. With all of these :class:`EmailPolicies <.EmailPolicy>`, the effective API of the email package is changed from the Python 3.2 API in the following ways: - * Setting a header on a :class:`~email.message.Message` results in that - header being parsed and a header object created. +* Setting a header on a :class:`~email.message.Message` results in that + header being parsed and a header object created. - * Fetching a header value from a :class:`~email.message.Message` results - in that header being parsed and a header object created and - returned. +* Fetching a header value from a :class:`~email.message.Message` results + in that header being parsed and a header object created and + returned. - * Any header object, or any header that is refolded due to the - policy settings, is folded using an algorithm that fully implements the - RFC folding algorithms, including knowing where encoded words are required - and allowed. +* Any header object, or any header that is refolded due to the + policy settings, is folded using an algorithm that fully implements the + RFC folding algorithms, including knowing where encoded words are required + and allowed. From the application view, this means that any header obtained through the :class:`~email.message.EmailMessage` is a header object with extra diff --git a/Doc/library/enum.rst b/Doc/library/enum.rst index 85da33156b1e38..8fb97ea9ed1c84 100644 --- a/Doc/library/enum.rst +++ b/Doc/library/enum.rst @@ -593,8 +593,8 @@ Data Types If a *Flag* operation is performed with an *IntFlag* member and: - * the result is a valid *IntFlag*: an *IntFlag* is returned - * the result is not a valid *IntFlag*: the result depends on the *FlagBoundary* setting + * the result is a valid *IntFlag*: an *IntFlag* is returned + * the result is not a valid *IntFlag*: the result depends on the *FlagBoundary* setting The *repr()* of unnamed zero-valued flags has changed. It is now: @@ -621,8 +621,8 @@ Data Types :class:`!ReprEnum` uses the :meth:`repr() ` of :class:`Enum`, but the :class:`str() ` of the mixed-in data type: - * :meth:`!int.__str__` for :class:`IntEnum` and :class:`IntFlag` - * :meth:`!str.__str__` for :class:`StrEnum` + * :meth:`!int.__str__` for :class:`IntEnum` and :class:`IntFlag` + * :meth:`!str.__str__` for :class:`StrEnum` Inherit from :class:`!ReprEnum` to keep the :class:`str() ` / :func:`format` of the mixed-in data type instead of using the @@ -781,13 +781,13 @@ Supported ``_sunder_`` names - ``_generate_next_value_`` -- used to get an appropriate value for an enum member; may be overridden - .. note:: + .. note:: - For standard :class:`Enum` classes the next value chosen is the last value seen - incremented by one. 
+ For standard :class:`Enum` classes the next value chosen is the last value seen + incremented by one. - For :class:`Flag` classes the next value chosen will be the next highest - power-of-two, regardless of the last value seen. + For :class:`Flag` classes the next value chosen will be the next highest + power-of-two, regardless of the last value seen. .. versionadded:: 3.6 ``_missing_``, ``_order_``, ``_generate_next_value_`` .. versionadded:: 3.7 ``_ignore_`` @@ -809,11 +809,11 @@ Utilities and Decorators *auto* instances are only resolved when at the top level of an assignment: - * ``FIRST = auto()`` will work (auto() is replaced with ``1``); - * ``SECOND = auto(), -2`` will work (auto is replaced with ``2``, so ``2, -2`` is - used to create the ``SECOND`` enum member; - * ``THREE = [auto(), -3]`` will *not* work (``, -3`` is used to - create the ``THREE`` enum member) + * ``FIRST = auto()`` will work (auto() is replaced with ``1``); + * ``SECOND = auto(), -2`` will work (auto is replaced with ``2``, so ``2, -2`` is + used to create the ``SECOND`` enum member; + * ``THREE = [auto(), -3]`` will *not* work (``, -3`` is used to + create the ``THREE`` enum member) .. versionchanged:: 3.11.1 diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 96cf5940d432c9..5e5fd467058edf 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -1157,8 +1157,8 @@ are always available. They are listed here in alphabetical order. See also :func:`format` for more information. - .. index:: - single: file object; open() built-in function +.. index:: + single: file object; open() built-in function .. function:: open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None, closefd=True, opener=None) @@ -1359,28 +1359,28 @@ are always available. They are listed here in alphabetical order. .. versionchanged:: 3.3 - * The *opener* parameter was added. - * The ``'x'`` mode was added. - * :exc:`IOError` used to be raised, it is now an alias of :exc:`OSError`. - * :exc:`FileExistsError` is now raised if the file opened in exclusive - creation mode (``'x'``) already exists. + * The *opener* parameter was added. + * The ``'x'`` mode was added. + * :exc:`IOError` used to be raised, it is now an alias of :exc:`OSError`. + * :exc:`FileExistsError` is now raised if the file opened in exclusive + creation mode (``'x'``) already exists. .. versionchanged:: 3.4 - * The file is now non-inheritable. + * The file is now non-inheritable. .. versionchanged:: 3.5 - * If the system call is interrupted and the signal handler does not raise an - exception, the function now retries the system call instead of raising an - :exc:`InterruptedError` exception (see :pep:`475` for the rationale). - * The ``'namereplace'`` error handler was added. + * If the system call is interrupted and the signal handler does not raise an + exception, the function now retries the system call instead of raising an + :exc:`InterruptedError` exception (see :pep:`475` for the rationale). + * The ``'namereplace'`` error handler was added. .. versionchanged:: 3.6 - * Support added to accept objects implementing :class:`os.PathLike`. - * On Windows, opening a console buffer may return a subclass of - :class:`io.RawIOBase` other than :class:`io.FileIO`. + * Support added to accept objects implementing :class:`os.PathLike`. + * On Windows, opening a console buffer may return a subclass of + :class:`io.RawIOBase` other than :class:`io.FileIO`. .. versionchanged:: 3.11 The ``'U'`` mode has been removed. 
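As a quick illustration of the ``'x'`` (exclusive creation) mode and the :exc:`FileExistsError` behaviour noted in the ``functions.rst`` hunk above, here is a short sketch; the filename and messages are made up::

    # Exclusive creation: open() succeeds only if the file does not exist yet.
    try:
        with open("notes.txt", "x", encoding="utf-8") as f:
            f.write("created exactly once\n")
    except FileExistsError:
        # A second run lands here instead of silently overwriting the file.
        print("notes.txt already exists; leaving it untouched")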
diff --git a/Doc/library/graphlib.rst b/Doc/library/graphlib.rst index fdd8f39ef4e1c4..5414d6370b78ce 100644 --- a/Doc/library/graphlib.rst +++ b/Doc/library/graphlib.rst @@ -37,14 +37,14 @@ In the general case, the steps required to perform the sorting of a given graph are as follows: - * Create an instance of the :class:`TopologicalSorter` with an optional - initial graph. - * Add additional nodes to the graph. - * Call :meth:`~TopologicalSorter.prepare` on the graph. - * While :meth:`~TopologicalSorter.is_active` is ``True``, iterate over - the nodes returned by :meth:`~TopologicalSorter.get_ready` and - process them. Call :meth:`~TopologicalSorter.done` on each node as it - finishes processing. + * Create an instance of the :class:`TopologicalSorter` with an optional + initial graph. + * Add additional nodes to the graph. + * Call :meth:`~TopologicalSorter.prepare` on the graph. + * While :meth:`~TopologicalSorter.is_active` is ``True``, iterate over + the nodes returned by :meth:`~TopologicalSorter.get_ready` and + process them. Call :meth:`~TopologicalSorter.done` on each node as it + finishes processing. In case just an immediate sorting of the nodes in the graph is required and no parallelism is involved, the convenience method diff --git a/Doc/library/idle.rst b/Doc/library/idle.rst index 3211da50dc745c..e710d0bacf3fee 100644 --- a/Doc/library/idle.rst +++ b/Doc/library/idle.rst @@ -439,24 +439,24 @@ the :kbd:`Command` key on macOS. * Some useful Emacs bindings are inherited from Tcl/Tk: - * :kbd:`C-a` beginning of line + * :kbd:`C-a` beginning of line - * :kbd:`C-e` end of line + * :kbd:`C-e` end of line - * :kbd:`C-k` kill line (but doesn't put it in clipboard) + * :kbd:`C-k` kill line (but doesn't put it in clipboard) - * :kbd:`C-l` center window around the insertion point + * :kbd:`C-l` center window around the insertion point - * :kbd:`C-b` go backward one character without deleting (usually you can - also use the cursor key for this) + * :kbd:`C-b` go backward one character without deleting (usually you can + also use the cursor key for this) - * :kbd:`C-f` go forward one character without deleting (usually you can - also use the cursor key for this) + * :kbd:`C-f` go forward one character without deleting (usually you can + also use the cursor key for this) - * :kbd:`C-p` go up one line (usually you can also use the cursor key for - this) + * :kbd:`C-p` go up one line (usually you can also use the cursor key for + this) - * :kbd:`C-d` delete next character + * :kbd:`C-d` delete next character Standard keybindings (like :kbd:`C-c` to copy and :kbd:`C-v` to paste) may work. Keybindings are selected in the Configure IDLE dialog. diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index 8b2bfb322529b1..22637a051d9fa6 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -1435,10 +1435,11 @@ generator to be determined easily. Get current state of a generator-iterator. Possible states are: - * GEN_CREATED: Waiting to start execution. - * GEN_RUNNING: Currently being executed by the interpreter. - * GEN_SUSPENDED: Currently suspended at a yield expression. - * GEN_CLOSED: Execution has completed. + + * GEN_CREATED: Waiting to start execution. + * GEN_RUNNING: Currently being executed by the interpreter. + * GEN_SUSPENDED: Currently suspended at a yield expression. + * GEN_CLOSED: Execution has completed. .. versionadded:: 3.2 @@ -1450,10 +1451,11 @@ generator to be determined easily. ``cr_frame`` attributes. 
Possible states are: - * CORO_CREATED: Waiting to start execution. - * CORO_RUNNING: Currently being executed by the interpreter. - * CORO_SUSPENDED: Currently suspended at an await expression. - * CORO_CLOSED: Execution has completed. + + * CORO_CREATED: Waiting to start execution. + * CORO_RUNNING: Currently being executed by the interpreter. + * CORO_SUSPENDED: Currently suspended at an await expression. + * CORO_CLOSED: Execution has completed. .. versionadded:: 3.5 diff --git a/Doc/library/io.rst b/Doc/library/io.rst index 01088879218cb4..6736aa9ee2b0ef 100644 --- a/Doc/library/io.rst +++ b/Doc/library/io.rst @@ -253,12 +253,12 @@ The implementation of I/O streams is organized as a hierarchy of classes. First specify the various categories of streams, then concrete classes providing the standard stream implementations. - .. note:: +.. note:: - The abstract base classes also provide default implementations of some - methods in order to help implementation of concrete stream classes. For - example, :class:`BufferedIOBase` provides unoptimized implementations of - :meth:`!readinto` and :meth:`!readline`. + The abstract base classes also provide default implementations of some + methods in order to help implementation of concrete stream classes. For + example, :class:`BufferedIOBase` provides unoptimized implementations of + :meth:`!readinto` and :meth:`!readline`. At the top of the I/O hierarchy is the abstract base class :class:`IOBase`. It defines the basic interface to a stream. Note, however, that there is no diff --git a/Doc/library/logging.config.rst b/Doc/library/logging.config.rst index 619e0406066809..45aa67a2e49f62 100644 --- a/Doc/library/logging.config.rst +++ b/Doc/library/logging.config.rst @@ -257,10 +257,10 @@ otherwise, the context is used to determine what to instantiate. which correspond to the arguments passed to create a :class:`~logging.Formatter` object: - * ``format`` - * ``datefmt`` - * ``style`` - * ``validate`` (since version >=3.8) + * ``format`` + * ``datefmt`` + * ``style`` + * ``validate`` (since version >=3.8) An optional ``class`` key indicates the name of the formatter's class (as a dotted module and class name). The instantiation @@ -543,9 +543,9 @@ valid keyword parameter name, and so will not clash with the names of the keyword arguments used in the call. The ``'()'`` also serves as a mnemonic that the corresponding value is a callable. - .. versionchanged:: 3.11 - The ``filters`` member of ``handlers`` and ``loggers`` can take - filter instances in addition to ids. +.. versionchanged:: 3.11 + The ``filters`` member of ``handlers`` and ``loggers`` can take + filter instances in addition to ids. You can also specify a special key ``'.'`` whose value is a dictionary is a mapping of attribute names to values. If found, the specified attributes will diff --git a/Doc/library/lzma.rst b/Doc/library/lzma.rst index 434e7ac9061186..0d69c3bc01d1e2 100644 --- a/Doc/library/lzma.rst +++ b/Doc/library/lzma.rst @@ -333,19 +333,22 @@ the key ``"id"``, and may contain additional keys to specify filter-dependent options. 
Valid filter IDs are as follows: * Compression filters: - * :const:`FILTER_LZMA1` (for use with :const:`FORMAT_ALONE`) - * :const:`FILTER_LZMA2` (for use with :const:`FORMAT_XZ` and :const:`FORMAT_RAW`) + + * :const:`FILTER_LZMA1` (for use with :const:`FORMAT_ALONE`) + * :const:`FILTER_LZMA2` (for use with :const:`FORMAT_XZ` and :const:`FORMAT_RAW`) * Delta filter: - * :const:`FILTER_DELTA` + + * :const:`FILTER_DELTA` * Branch-Call-Jump (BCJ) filters: - * :const:`FILTER_X86` - * :const:`FILTER_IA64` - * :const:`FILTER_ARM` - * :const:`FILTER_ARMTHUMB` - * :const:`FILTER_POWERPC` - * :const:`FILTER_SPARC` + + * :const:`FILTER_X86` + * :const:`FILTER_IA64` + * :const:`FILTER_ARM` + * :const:`FILTER_ARMTHUMB` + * :const:`FILTER_POWERPC` + * :const:`FILTER_SPARC` A filter chain can consist of up to 4 filters, and cannot be empty. The last filter in the chain must be a compression filter, and any other filters must be @@ -354,21 +357,21 @@ delta or BCJ filters. Compression filters support the following options (specified as additional entries in the dictionary representing the filter): - * ``preset``: A compression preset to use as a source of default values for - options that are not specified explicitly. - * ``dict_size``: Dictionary size in bytes. This should be between 4 KiB and - 1.5 GiB (inclusive). - * ``lc``: Number of literal context bits. - * ``lp``: Number of literal position bits. The sum ``lc + lp`` must be at - most 4. - * ``pb``: Number of position bits; must be at most 4. - * ``mode``: :const:`MODE_FAST` or :const:`MODE_NORMAL`. - * ``nice_len``: What should be considered a "nice length" for a match. - This should be 273 or less. - * ``mf``: What match finder to use -- :const:`MF_HC3`, :const:`MF_HC4`, - :const:`MF_BT2`, :const:`MF_BT3`, or :const:`MF_BT4`. - * ``depth``: Maximum search depth used by match finder. 0 (default) means to - select automatically based on other filter options. +* ``preset``: A compression preset to use as a source of default values for + options that are not specified explicitly. +* ``dict_size``: Dictionary size in bytes. This should be between 4 KiB and + 1.5 GiB (inclusive). +* ``lc``: Number of literal context bits. +* ``lp``: Number of literal position bits. The sum ``lc + lp`` must be at + most 4. +* ``pb``: Number of position bits; must be at most 4. +* ``mode``: :const:`MODE_FAST` or :const:`MODE_NORMAL`. +* ``nice_len``: What should be considered a "nice length" for a match. + This should be 273 or less. +* ``mf``: What match finder to use -- :const:`MF_HC3`, :const:`MF_HC4`, + :const:`MF_BT2`, :const:`MF_BT3`, or :const:`MF_BT4`. +* ``depth``: Maximum search depth used by match finder. 0 (default) means to + select automatically based on other filter options. The delta filter stores the differences between bytes, producing more repetitive input for the compressor in certain circumstances. It supports one option, diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 236b408499e235..2162328be9adb2 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -2751,20 +2751,20 @@ worker threads rather than worker processes. Unlike :class:`Pool`, *maxtasksperchild* and *context* cannot be provided. - .. note:: - - A :class:`ThreadPool` shares the same interface as :class:`Pool`, which - is designed around a pool of processes and predates the introduction of - the :class:`concurrent.futures` module. 
As such, it inherits some - operations that don't make sense for a pool backed by threads, and it - has its own type for representing the status of asynchronous jobs, - :class:`AsyncResult`, that is not understood by any other libraries. - - Users should generally prefer to use - :class:`concurrent.futures.ThreadPoolExecutor`, which has a simpler - interface that was designed around threads from the start, and which - returns :class:`concurrent.futures.Future` instances that are - compatible with many other libraries, including :mod:`asyncio`. + .. note:: + + A :class:`ThreadPool` shares the same interface as :class:`Pool`, which + is designed around a pool of processes and predates the introduction of + the :class:`concurrent.futures` module. As such, it inherits some + operations that don't make sense for a pool backed by threads, and it + has its own type for representing the status of asynchronous jobs, + :class:`AsyncResult`, that is not understood by any other libraries. + + Users should generally prefer to use + :class:`concurrent.futures.ThreadPoolExecutor`, which has a simpler + interface that was designed around threads from the start, and which + returns :class:`concurrent.futures.Future` instances that are + compatible with many other libraries, including :mod:`asyncio`. .. _multiprocessing-programming: diff --git a/Doc/library/numbers.rst b/Doc/library/numbers.rst index b3dce151aee289..2a05b56db051f9 100644 --- a/Doc/library/numbers.rst +++ b/Doc/library/numbers.rst @@ -160,23 +160,23 @@ refer to ``MyIntegral`` and ``OtherTypeIKnowAbout`` as of :class:`Complex` (``a : A <: Complex``), and ``b : B <: Complex``. I'll consider ``a + b``: - 1. If ``A`` defines an :meth:`__add__` which accepts ``b``, all is - well. - 2. If ``A`` falls back to the boilerplate code, and it were to - return a value from :meth:`__add__`, we'd miss the possibility - that ``B`` defines a more intelligent :meth:`__radd__`, so the - boilerplate should return :const:`NotImplemented` from - :meth:`__add__`. (Or ``A`` may not implement :meth:`__add__` at - all.) - 3. Then ``B``'s :meth:`__radd__` gets a chance. If it accepts - ``a``, all is well. - 4. If it falls back to the boilerplate, there are no more possible - methods to try, so this is where the default implementation - should live. - 5. If ``B <: A``, Python tries ``B.__radd__`` before - ``A.__add__``. This is ok, because it was implemented with - knowledge of ``A``, so it can handle those instances before - delegating to :class:`Complex`. +1. If ``A`` defines an :meth:`__add__` which accepts ``b``, all is + well. +2. If ``A`` falls back to the boilerplate code, and it were to + return a value from :meth:`__add__`, we'd miss the possibility + that ``B`` defines a more intelligent :meth:`__radd__`, so the + boilerplate should return :const:`NotImplemented` from + :meth:`__add__`. (Or ``A`` may not implement :meth:`__add__` at + all.) +3. Then ``B``'s :meth:`__radd__` gets a chance. If it accepts + ``a``, all is well. +4. If it falls back to the boilerplate, there are no more possible + methods to try, so this is where the default implementation + should live. +5. If ``B <: A``, Python tries ``B.__radd__`` before + ``A.__add__``. This is ok, because it was implemented with + knowledge of ``A``, so it can handle those instances before + delegating to :class:`Complex`. 
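Steps 1-5 above describe the reflected-operator fallback in the abstract; the following self-contained sketch shows the same dispatch with a made-up ``MyNumber`` class standing in for the types involved (it is not part of :mod:`numbers` itself)::

    class MyNumber:
        def __init__(self, value):
            self.value = value

        def __add__(self, other):
            if isinstance(other, MyNumber):
                return MyNumber(self.value + other.value)
            # Defer, so the other operand's __radd__ gets a chance.
            return NotImplemented

        def __radd__(self, other):
            if isinstance(other, (int, float)):
                return MyNumber(other + self.value)
            # No handler on this side either; Python will then raise TypeError.
            return NotImplemented

        def __repr__(self):
            return f"MyNumber({self.value!r})"

    # int.__add__ does not know about MyNumber, so it returns NotImplemented
    # and Python falls back to MyNumber.__radd__:
    print(3 + MyNumber(4))    # MyNumber(7)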
If ``A <: Complex`` and ``B <: Real`` without sharing any other knowledge, then the appropriate shared operation is the one involving the built diff --git a/Doc/library/profile.rst b/Doc/library/profile.rst index 69274b0c354a25..4c60a1e0d781b0 100644 --- a/Doc/library/profile.rst +++ b/Doc/library/profile.rst @@ -135,11 +135,11 @@ the output by. This only applies when ``-o`` is not supplied. ``-m`` specifies that a module is being profiled instead of a script. - .. versionadded:: 3.7 - Added the ``-m`` option to :mod:`cProfile`. +.. versionadded:: 3.7 + Added the ``-m`` option to :mod:`cProfile`. - .. versionadded:: 3.8 - Added the ``-m`` option to :mod:`profile`. +.. versionadded:: 3.8 + Added the ``-m`` option to :mod:`profile`. The :mod:`pstats` module's :class:`~pstats.Stats` class has a variety of methods for manipulating and printing the data saved into a profile results file:: diff --git a/Doc/library/re.rst b/Doc/library/re.rst index 5bea6166a1e4ef..83fff79d2884e5 100644 --- a/Doc/library/re.rst +++ b/Doc/library/re.rst @@ -176,7 +176,7 @@ The special characters are: ``x*+``, ``x++`` and ``x?+`` are equivalent to ``(?>x*)``, ``(?>x+)`` and ``(?>x?)`` correspondingly. - .. versionadded:: 3.11 + .. versionadded:: 3.11 .. index:: single: {} (curly brackets); in regular expressions diff --git a/Doc/library/shelve.rst b/Doc/library/shelve.rst index 01314f491f47a7..219219af6fd87f 100644 --- a/Doc/library/shelve.rst +++ b/Doc/library/shelve.rst @@ -94,9 +94,9 @@ Two additional methods are supported: Restrictions ------------ - .. index:: - pair: module; dbm.ndbm - pair: module; dbm.gnu +.. index:: + pair: module; dbm.ndbm + pair: module; dbm.gnu * The choice of which database package will be used (such as :mod:`dbm.ndbm` or :mod:`dbm.gnu`) depends on which interface is available. Therefore it is not diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index 32afacbbd231a7..c7274f59f43d1a 100644 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -204,14 +204,14 @@ created. Socket addresses are represented as follows: - *addr* - Optional bytes-like object specifying the hardware physical address, whose interpretation depends on the device. - .. availability:: Linux >= 2.2. + .. availability:: Linux >= 2.2. - :const:`AF_QIPCRTR` is a Linux-only socket based interface for communicating with services running on co-processors in Qualcomm platforms. The address family is represented as a ``(node, port)`` tuple where the *node* and *port* are non-negative integers. - .. availability:: Linux >= 4.7. + .. availability:: Linux >= 4.7. .. versionadded:: 3.8 diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index c85f4fc49c070e..ef92bad596ef15 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -235,11 +235,11 @@ inserted data and retrieved values from it in multiple ways. * :ref:`sqlite3-howtos` for further reading: - * :ref:`sqlite3-placeholders` - * :ref:`sqlite3-adapters` - * :ref:`sqlite3-converters` - * :ref:`sqlite3-connection-context-manager` - * :ref:`sqlite3-howto-row-factory` + * :ref:`sqlite3-placeholders` + * :ref:`sqlite3-adapters` + * :ref:`sqlite3-converters` + * :ref:`sqlite3-connection-context-manager` + * :ref:`sqlite3-howto-row-factory` * :ref:`sqlite3-explanation` for in-depth background on transaction control. @@ -498,13 +498,13 @@ Module constants the default `threading mode `_ the underlying SQLite library is compiled with. The SQLite threading modes are: - 1. 
**Single-thread**: In this mode, all mutexes are disabled and SQLite is - unsafe to use in more than a single thread at once. - 2. **Multi-thread**: In this mode, SQLite can be safely used by multiple - threads provided that no single database connection is used - simultaneously in two or more threads. - 3. **Serialized**: In serialized mode, SQLite can be safely used by - multiple threads with no restriction. + 1. **Single-thread**: In this mode, all mutexes are disabled and SQLite is + unsafe to use in more than a single thread at once. + 2. **Multi-thread**: In this mode, SQLite can be safely used by multiple + threads provided that no single database connection is used + simultaneously in two or more threads. + 3. **Serialized**: In serialized mode, SQLite can be safely used by + multiple threads with no restriction. The mappings from SQLite threading modes to DB-API 2.0 threadsafety levels are as follows: diff --git a/Doc/library/ssl.rst b/Doc/library/ssl.rst index 25369969f544b5..e7694d70163f76 100644 --- a/Doc/library/ssl.rst +++ b/Doc/library/ssl.rst @@ -1474,18 +1474,18 @@ to speed up repeated connections from the same clients. Here's a table showing which versions in a client (down the side) can connect to which versions in a server (along the top): - .. table:: - - ======================== ============ ============ ============= ========= =========== =========== - *client* / **server** **SSLv2** **SSLv3** **TLS** [3]_ **TLSv1** **TLSv1.1** **TLSv1.2** - ------------------------ ------------ ------------ ------------- --------- ----------- ----------- - *SSLv2* yes no no [1]_ no no no - *SSLv3* no yes no [2]_ no no no - *TLS* (*SSLv23*) [3]_ no [1]_ no [2]_ yes yes yes yes - *TLSv1* no no yes yes no no - *TLSv1.1* no no yes no yes no - *TLSv1.2* no no yes no no yes - ======================== ============ ============ ============= ========= =========== =========== + .. table:: + + ======================== ============ ============ ============= ========= =========== =========== + *client* / **server** **SSLv2** **SSLv3** **TLS** [3]_ **TLSv1** **TLSv1.1** **TLSv1.2** + ------------------------ ------------ ------------ ------------- --------- ----------- ----------- + *SSLv2* yes no no [1]_ no no no + *SSLv3* no yes no [2]_ no no no + *TLS* (*SSLv23*) [3]_ no [1]_ no [2]_ yes yes yes yes + *TLSv1* no no yes yes no no + *TLSv1.1* no no yes no yes no + *TLSv1.2* no no yes no no yes + ======================== ============ ============ ============= ========= =========== =========== .. rubric:: Footnotes .. [1] :class:`SSLContext` disables SSLv2 with :data:`OP_NO_SSLv2` by default. diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index 6ba2dd0930ed7d..e9ecd7821ec837 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -48,9 +48,9 @@ By default, an object is considered true unless its class defines either a returns zero, when called with the object. [1]_ Here are most of the built-in objects considered false: - .. index:: - single: None (Built-in object) - single: False (Built-in object) +.. index:: + single: None (Built-in object) + single: False (Built-in object) * constants defined to be false: ``None`` and ``False`` diff --git a/Doc/library/string.rst b/Doc/library/string.rst index b2ea9c1781b870..5fbd82a5ec852e 100644 --- a/Doc/library/string.rst +++ b/Doc/library/string.rst @@ -206,15 +206,15 @@ literal text, it can be escaped by doubling: ``{{`` and ``}}``. The grammar for a replacement field is as follows: - .. 
productionlist:: format-string - replacement_field: "{" [`field_name`] ["!" `conversion`] [":" `format_spec`] "}" - field_name: arg_name ("." `attribute_name` | "[" `element_index` "]")* - arg_name: [`identifier` | `digit`+] - attribute_name: `identifier` - element_index: `digit`+ | `index_string` - index_string: + - conversion: "r" | "s" | "a" - format_spec: +.. productionlist:: format-string + replacement_field: "{" [`field_name`] ["!" `conversion`] [":" `format_spec`] "}" + field_name: arg_name ("." `attribute_name` | "[" `element_index` "]")* + arg_name: [`identifier` | `digit`+] + attribute_name: `identifier` + element_index: `digit`+ | `index_string` + index_string: + + conversion: "r" | "s" | "a" + format_spec: In less formal terms, the replacement field can start with a *field_name* that specifies the object whose value is to be formatted and inserted @@ -332,30 +332,30 @@ affect the :func:`format` function. The meaning of the various alignment options is as follows: - .. index:: - single: < (less); in string formatting - single: > (greater); in string formatting - single: = (equals); in string formatting - single: ^ (caret); in string formatting - - +---------+----------------------------------------------------------+ - | Option | Meaning | - +=========+==========================================================+ - | ``'<'`` | Forces the field to be left-aligned within the available | - | | space (this is the default for most objects). | - +---------+----------------------------------------------------------+ - | ``'>'`` | Forces the field to be right-aligned within the | - | | available space (this is the default for numbers). | - +---------+----------------------------------------------------------+ - | ``'='`` | Forces the padding to be placed after the sign (if any) | - | | but before the digits. This is used for printing fields | - | | in the form '+000000120'. This alignment option is only | - | | valid for numeric types. It becomes the default for | - | | numbers when '0' immediately precedes the field width. | - +---------+----------------------------------------------------------+ - | ``'^'`` | Forces the field to be centered within the available | - | | space. | - +---------+----------------------------------------------------------+ +.. index:: + single: < (less); in string formatting + single: > (greater); in string formatting + single: = (equals); in string formatting + single: ^ (caret); in string formatting + ++---------+----------------------------------------------------------+ +| Option | Meaning | ++=========+==========================================================+ +| ``'<'`` | Forces the field to be left-aligned within the available | +| | space (this is the default for most objects). | ++---------+----------------------------------------------------------+ +| ``'>'`` | Forces the field to be right-aligned within the | +| | available space (this is the default for numbers). | ++---------+----------------------------------------------------------+ +| ``'='`` | Forces the padding to be placed after the sign (if any) | +| | but before the digits. This is used for printing fields | +| | in the form '+000000120'. This alignment option is only | +| | valid for numeric types. It becomes the default for | +| | numbers when '0' immediately precedes the field width. | ++---------+----------------------------------------------------------+ +| ``'^'`` | Forces the field to be centered within the available | +| | space. 
| ++---------+----------------------------------------------------------+ Note that unless a minimum field width is defined, the field width will always be the same size as the data to fill it, so that the alignment option has no @@ -364,23 +364,23 @@ meaning in this case. The *sign* option is only valid for number types, and can be one of the following: - .. index:: - single: + (plus); in string formatting - single: - (minus); in string formatting - single: space; in string formatting - - +---------+----------------------------------------------------------+ - | Option | Meaning | - +=========+==========================================================+ - | ``'+'`` | indicates that a sign should be used for both | - | | positive as well as negative numbers. | - +---------+----------------------------------------------------------+ - | ``'-'`` | indicates that a sign should be used only for negative | - | | numbers (this is the default behavior). | - +---------+----------------------------------------------------------+ - | space | indicates that a leading space should be used on | - | | positive numbers, and a minus sign on negative numbers. | - +---------+----------------------------------------------------------+ +.. index:: + single: + (plus); in string formatting + single: - (minus); in string formatting + single: space; in string formatting + ++---------+----------------------------------------------------------+ +| Option | Meaning | ++=========+==========================================================+ +| ``'+'`` | indicates that a sign should be used for both | +| | positive as well as negative numbers. | ++---------+----------------------------------------------------------+ +| ``'-'`` | indicates that a sign should be used only for negative | +| | numbers (this is the default behavior). | ++---------+----------------------------------------------------------+ +| space | indicates that a leading space should be used on | +| | positive numbers, and a minus sign on negative numbers. | ++---------+----------------------------------------------------------+ .. index:: single: z; in string formatting diff --git a/Doc/library/subprocess.rst b/Doc/library/subprocess.rst index 377f6938701da2..e5e992c8c87fd4 100644 --- a/Doc/library/subprocess.rst +++ b/Doc/library/subprocess.rst @@ -668,18 +668,18 @@ functions. passed to the underlying ``CreateProcess`` function. *creationflags*, if given, can be one or more of the following flags: - * :data:`CREATE_NEW_CONSOLE` - * :data:`CREATE_NEW_PROCESS_GROUP` - * :data:`ABOVE_NORMAL_PRIORITY_CLASS` - * :data:`BELOW_NORMAL_PRIORITY_CLASS` - * :data:`HIGH_PRIORITY_CLASS` - * :data:`IDLE_PRIORITY_CLASS` - * :data:`NORMAL_PRIORITY_CLASS` - * :data:`REALTIME_PRIORITY_CLASS` - * :data:`CREATE_NO_WINDOW` - * :data:`DETACHED_PROCESS` - * :data:`CREATE_DEFAULT_ERROR_MODE` - * :data:`CREATE_BREAKAWAY_FROM_JOB` + * :data:`CREATE_NEW_CONSOLE` + * :data:`CREATE_NEW_PROCESS_GROUP` + * :data:`ABOVE_NORMAL_PRIORITY_CLASS` + * :data:`BELOW_NORMAL_PRIORITY_CLASS` + * :data:`HIGH_PRIORITY_CLASS` + * :data:`IDLE_PRIORITY_CLASS` + * :data:`NORMAL_PRIORITY_CLASS` + * :data:`REALTIME_PRIORITY_CLASS` + * :data:`CREATE_NO_WINDOW` + * :data:`DETACHED_PROCESS` + * :data:`CREATE_DEFAULT_ERROR_MODE` + * :data:`CREATE_BREAKAWAY_FROM_JOB` *pipesize* can be used to change the size of the pipe when :data:`PIPE` is used for *stdin*, *stdout* or *stderr*. The size of the pipe @@ -744,8 +744,8 @@ the timeout expires before the process exits. 
Exceptions defined in this module all inherit from :exc:`SubprocessError`. - .. versionadded:: 3.3 - The :exc:`SubprocessError` base class was added. +.. versionadded:: 3.3 + The :exc:`SubprocessError` base class was added. .. _subprocess-security: diff --git a/Doc/library/tkinter.rst b/Doc/library/tkinter.rst index 246abf374b0219..ee34f2659cf3d8 100644 --- a/Doc/library/tkinter.rst +++ b/Doc/library/tkinter.rst @@ -533,24 +533,24 @@ interpreter will fail. A number of special cases exist: - * Tcl/Tk libraries can be built so they are not thread-aware. In this case, - :mod:`tkinter` calls the library from the originating Python thread, even - if this is different than the thread that created the Tcl interpreter. A global - lock ensures only one call occurs at a time. - - * While :mod:`tkinter` allows you to create more than one instance of a :class:`Tk` - object (with its own interpreter), all interpreters that are part of the same - thread share a common event queue, which gets ugly fast. In practice, don't create - more than one instance of :class:`Tk` at a time. Otherwise, it's best to create - them in separate threads and ensure you're running a thread-aware Tcl/Tk build. - - * Blocking event handlers are not the only way to prevent the Tcl interpreter from - reentering the event loop. It is even possible to run multiple nested event loops - or abandon the event loop entirely. If you're doing anything tricky when it comes - to events or threads, be aware of these possibilities. - - * There are a few select :mod:`tkinter` functions that presently work only when - called from the thread that created the Tcl interpreter. +* Tcl/Tk libraries can be built so they are not thread-aware. In this case, + :mod:`tkinter` calls the library from the originating Python thread, even + if this is different than the thread that created the Tcl interpreter. A global + lock ensures only one call occurs at a time. + +* While :mod:`tkinter` allows you to create more than one instance of a :class:`Tk` + object (with its own interpreter), all interpreters that are part of the same + thread share a common event queue, which gets ugly fast. In practice, don't create + more than one instance of :class:`Tk` at a time. Otherwise, it's best to create + them in separate threads and ensure you're running a thread-aware Tcl/Tk build. + +* Blocking event handlers are not the only way to prevent the Tcl interpreter from + reentering the event loop. It is even possible to run multiple nested event loops + or abandon the event loop entirely. If you're doing anything tricky when it comes + to events or threads, be aware of these possibilities. + +* There are a few select :mod:`tkinter` functions that presently work only when + called from the thread that created the Tcl interpreter. Handy Reference diff --git a/Doc/library/tkinter.ttk.rst b/Doc/library/tkinter.ttk.rst index 9f2f9eb858afd4..dc31a1a4c1850a 100644 --- a/Doc/library/tkinter.ttk.rst +++ b/Doc/library/tkinter.ttk.rst @@ -104,33 +104,33 @@ Standard Options All the :mod:`ttk` Widgets accept the following options: - .. tabularcolumns:: |l|L| - - +-----------+--------------------------------------------------------------+ - | Option | Description | - +===========+==============================================================+ - | class | Specifies the window class. 
The class is used when querying | - | | the option database for the window's other options, to | - | | determine the default bindtags for the window, and to select | - | | the widget's default layout and style. This option is | - | | read-only, and may only be specified when the window is | - | | created. | - +-----------+--------------------------------------------------------------+ - | cursor | Specifies the mouse cursor to be used for the widget. If set | - | | to the empty string (the default), the cursor is inherited | - | | for the parent widget. | - +-----------+--------------------------------------------------------------+ - | takefocus | Determines whether the window accepts the focus during | - | | keyboard traversal. 0, 1 or an empty string is returned. | - | | If 0 is returned, it means that the window should be skipped | - | | entirely during keyboard traversal. If 1, it means that the | - | | window should receive the input focus as long as it is | - | | viewable. And an empty string means that the traversal | - | | scripts make the decision about whether or not to focus | - | | on the window. | - +-----------+--------------------------------------------------------------+ - | style | May be used to specify a custom widget style. | - +-----------+--------------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++-----------+--------------------------------------------------------------+ +| Option | Description | ++===========+==============================================================+ +| class | Specifies the window class. The class is used when querying | +| | the option database for the window's other options, to | +| | determine the default bindtags for the window, and to select | +| | the widget's default layout and style. This option is | +| | read-only, and may only be specified when the window is | +| | created. | ++-----------+--------------------------------------------------------------+ +| cursor | Specifies the mouse cursor to be used for the widget. If set | +| | to the empty string (the default), the cursor is inherited | +| | for the parent widget. | ++-----------+--------------------------------------------------------------+ +| takefocus | Determines whether the window accepts the focus during | +| | keyboard traversal. 0, 1 or an empty string is returned. | +| | If 0 is returned, it means that the window should be skipped | +| | entirely during keyboard traversal. If 1, it means that the | +| | window should receive the input focus as long as it is | +| | viewable. And an empty string means that the traversal | +| | scripts make the decision about whether or not to focus | +| | on the window. | ++-----------+--------------------------------------------------------------+ +| style | May be used to specify a custom widget style. | ++-----------+--------------------------------------------------------------+ Scrollable Widget Options @@ -139,24 +139,24 @@ Scrollable Widget Options The following options are supported by widgets that are controlled by a scrollbar. - .. tabularcolumns:: |l|L| - - +----------------+---------------------------------------------------------+ - | Option | Description | - +================+=========================================================+ - | xscrollcommand | Used to communicate with horizontal scrollbars. | - | | | - | | When the view in the widget's window change, the widget | - | | will generate a Tcl command based on the scrollcommand. 
| - | | | - | | Usually this option consists of the method | - | | :meth:`Scrollbar.set` of some scrollbar. This will cause| - | | the scrollbar to be updated whenever the view in the | - | | window changes. | - +----------------+---------------------------------------------------------+ - | yscrollcommand | Used to communicate with vertical scrollbars. | - | | For some more information, see above. | - +----------------+---------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++----------------+---------------------------------------------------------+ +| Option | Description | ++================+=========================================================+ +| xscrollcommand | Used to communicate with horizontal scrollbars. | +| | | +| | When the view in the widget's window change, the widget | +| | will generate a Tcl command based on the scrollcommand. | +| | | +| | Usually this option consists of the method | +| | :meth:`Scrollbar.set` of some scrollbar. This will cause| +| | the scrollbar to be updated whenever the view in the | +| | window changes. | ++----------------+---------------------------------------------------------+ +| yscrollcommand | Used to communicate with vertical scrollbars. | +| | For some more information, see above. | ++----------------+---------------------------------------------------------+ Label Options @@ -165,93 +165,93 @@ Label Options The following options are supported by labels, buttons and other button-like widgets. - .. tabularcolumns:: |l|p{0.7\linewidth}| - - +--------------+-----------------------------------------------------------+ - | Option | Description | - +==============+===========================================================+ - | text | Specifies a text string to be displayed inside the widget.| - +--------------+-----------------------------------------------------------+ - | textvariable | Specifies a name whose value will be used in place of the | - | | text option resource. | - +--------------+-----------------------------------------------------------+ - | underline | If set, specifies the index (0-based) of a character to | - | | underline in the text string. The underline character is | - | | used for mnemonic activation. | - +--------------+-----------------------------------------------------------+ - | image | Specifies an image to display. This is a list of 1 or more| - | | elements. The first element is the default image name. The| - | | rest of the list if a sequence of statespec/value pairs as| - | | defined by :meth:`Style.map`, specifying different images | - | | to use when the widget is in a particular state or a | - | | combination of states. All images in the list should have | - | | the same size. | - +--------------+-----------------------------------------------------------+ - | compound | Specifies how to display the image relative to the text, | - | | in the case both text and images options are present. | - | | Valid values are: | - | | | - | | * text: display text only | - | | * image: display image only | - | | * top, bottom, left, right: display image above, below, | - | | left of, or right of the text, respectively. | - | | * none: the default. display the image if present, | - | | otherwise the text. | - +--------------+-----------------------------------------------------------+ - | width | If greater than zero, specifies how much space, in | - | | character widths, to allocate for the text label, if less | - | | than zero, specifies a minimum width. 
If zero or | - | | unspecified, the natural width of the text label is used. | - +--------------+-----------------------------------------------------------+ +.. tabularcolumns:: |l|p{0.7\linewidth}| + ++--------------+-----------------------------------------------------------+ +| Option | Description | ++==============+===========================================================+ +| text | Specifies a text string to be displayed inside the widget.| ++--------------+-----------------------------------------------------------+ +| textvariable | Specifies a name whose value will be used in place of the | +| | text option resource. | ++--------------+-----------------------------------------------------------+ +| underline | If set, specifies the index (0-based) of a character to | +| | underline in the text string. The underline character is | +| | used for mnemonic activation. | ++--------------+-----------------------------------------------------------+ +| image | Specifies an image to display. This is a list of 1 or more| +| | elements. The first element is the default image name. The| +| | rest of the list if a sequence of statespec/value pairs as| +| | defined by :meth:`Style.map`, specifying different images | +| | to use when the widget is in a particular state or a | +| | combination of states. All images in the list should have | +| | the same size. | ++--------------+-----------------------------------------------------------+ +| compound | Specifies how to display the image relative to the text, | +| | in the case both text and images options are present. | +| | Valid values are: | +| | | +| | * text: display text only | +| | * image: display image only | +| | * top, bottom, left, right: display image above, below, | +| | left of, or right of the text, respectively. | +| | * none: the default. display the image if present, | +| | otherwise the text. | ++--------------+-----------------------------------------------------------+ +| width | If greater than zero, specifies how much space, in | +| | character widths, to allocate for the text label, if less | +| | than zero, specifies a minimum width. If zero or | +| | unspecified, the natural width of the text label is used. | ++--------------+-----------------------------------------------------------+ Compatibility Options ^^^^^^^^^^^^^^^^^^^^^ - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| - +--------+----------------------------------------------------------------+ - | Option | Description | - +========+================================================================+ - | state | May be set to "normal" or "disabled" to control the "disabled" | - | | state bit. This is a write-only option: setting it changes the | - | | widget state, but the :meth:`Widget.state` method does not | - | | affect this option. | - +--------+----------------------------------------------------------------+ ++--------+----------------------------------------------------------------+ +| Option | Description | ++========+================================================================+ +| state | May be set to "normal" or "disabled" to control the "disabled" | +| | state bit. This is a write-only option: setting it changes the | +| | widget state, but the :meth:`Widget.state` method does not | +| | affect this option. | ++--------+----------------------------------------------------------------+ Widget States ^^^^^^^^^^^^^ The widget state is a bitmap of independent state flags. - .. 
tabularcolumns:: |l|L| - - +------------+-------------------------------------------------------------+ - | Flag | Description | - +============+=============================================================+ - | active | The mouse cursor is over the widget and pressing a mouse | - | | button will cause some action to occur | - +------------+-------------------------------------------------------------+ - | disabled | Widget is disabled under program control | - +------------+-------------------------------------------------------------+ - | focus | Widget has keyboard focus | - +------------+-------------------------------------------------------------+ - | pressed | Widget is being pressed | - +------------+-------------------------------------------------------------+ - | selected | "On", "true", or "current" for things like Checkbuttons and | - | | radiobuttons | - +------------+-------------------------------------------------------------+ - | background | Windows and Mac have a notion of an "active" or foreground | - | | window. The *background* state is set for widgets in a | - | | background window, and cleared for those in the foreground | - | | window | - +------------+-------------------------------------------------------------+ - | readonly | Widget should not allow user modification | - +------------+-------------------------------------------------------------+ - | alternate | A widget-specific alternate display format | - +------------+-------------------------------------------------------------+ - | invalid | The widget's value is invalid | - +------------+-------------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++------------+-------------------------------------------------------------+ +| Flag | Description | ++============+=============================================================+ +| active | The mouse cursor is over the widget and pressing a mouse | +| | button will cause some action to occur | ++------------+-------------------------------------------------------------+ +| disabled | Widget is disabled under program control | ++------------+-------------------------------------------------------------+ +| focus | Widget has keyboard focus | ++------------+-------------------------------------------------------------+ +| pressed | Widget is being pressed | ++------------+-------------------------------------------------------------+ +| selected | "On", "true", or "current" for things like Checkbuttons and | +| | radiobuttons | ++------------+-------------------------------------------------------------+ +| background | Windows and Mac have a notion of an "active" or foreground | +| | window. The *background* state is set for widgets in a | +| | background window, and cleared for those in the foreground | +| | window | ++------------+-------------------------------------------------------------+ +| readonly | Widget should not allow user modification | ++------------+-------------------------------------------------------------+ +| alternate | A widget-specific alternate display format | ++------------+-------------------------------------------------------------+ +| invalid | The widget's value is invalid | ++------------+-------------------------------------------------------------+ A state specification is a sequence of state names, optionally prefixed with an exclamation point indicating that the bit is off. @@ -311,43 +311,43 @@ Options This widget accepts the following specific options: - .. 
tabularcolumns:: |l|L| - - +-----------------+--------------------------------------------------------+ - | Option | Description | - +=================+========================================================+ - | exportselection | Boolean value. If set, the widget selection is linked | - | | to the Window Manager selection (which can be returned | - | | by invoking Misc.selection_get, for example). | - +-----------------+--------------------------------------------------------+ - | justify | Specifies how the text is aligned within the widget. | - | | One of "left", "center", or "right". | - +-----------------+--------------------------------------------------------+ - | height | Specifies the height of the pop-down listbox, in rows. | - +-----------------+--------------------------------------------------------+ - | postcommand | A script (possibly registered with Misc.register) that | - | | is called immediately before displaying the values. It | - | | may specify which values to display. | - +-----------------+--------------------------------------------------------+ - | state | One of "normal", "readonly", or "disabled". In the | - | | "readonly" state, the value may not be edited directly,| - | | and the user can only selection of the values from the | - | | dropdown list. In the "normal" state, the text field is| - | | directly editable. In the "disabled" state, no | - | | interaction is possible. | - +-----------------+--------------------------------------------------------+ - | textvariable | Specifies a name whose value is linked to the widget | - | | value. Whenever the value associated with that name | - | | changes, the widget value is updated, and vice versa. | - | | See :class:`tkinter.StringVar`. | - +-----------------+--------------------------------------------------------+ - | values | Specifies the list of values to display in the | - | | drop-down listbox. | - +-----------------+--------------------------------------------------------+ - | width | Specifies an integer value indicating the desired width| - | | of the entry window, in average-size characters of the | - | | widget's font. | - +-----------------+--------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++-----------------+--------------------------------------------------------+ +| Option | Description | ++=================+========================================================+ +| exportselection | Boolean value. If set, the widget selection is linked | +| | to the Window Manager selection (which can be returned | +| | by invoking Misc.selection_get, for example). | ++-----------------+--------------------------------------------------------+ +| justify | Specifies how the text is aligned within the widget. | +| | One of "left", "center", or "right". | ++-----------------+--------------------------------------------------------+ +| height | Specifies the height of the pop-down listbox, in rows. | ++-----------------+--------------------------------------------------------+ +| postcommand | A script (possibly registered with Misc.register) that | +| | is called immediately before displaying the values. It | +| | may specify which values to display. | ++-----------------+--------------------------------------------------------+ +| state | One of "normal", "readonly", or "disabled". In the | +| | "readonly" state, the value may not be edited directly,| +| | and the user can only selection of the values from the | +| | dropdown list. 
In the "normal" state, the text field is| +| | directly editable. In the "disabled" state, no | +| | interaction is possible. | ++-----------------+--------------------------------------------------------+ +| textvariable | Specifies a name whose value is linked to the widget | +| | value. Whenever the value associated with that name | +| | changes, the widget value is updated, and vice versa. | +| | See :class:`tkinter.StringVar`. | ++-----------------+--------------------------------------------------------+ +| values | Specifies the list of values to display in the | +| | drop-down listbox. | ++-----------------+--------------------------------------------------------+ +| width | Specifies an integer value indicating the desired width| +| | of the entry window, in average-size characters of the | +| | widget's font. | ++-----------------+--------------------------------------------------------+ Virtual events @@ -397,7 +397,7 @@ Options This widget accepts the following specific options: - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| +----------------------+------------------------------------------------------+ | Option | Description | @@ -473,25 +473,25 @@ Options This widget accepts the following specific options: - .. tabularcolumns:: |l|L| - - +---------+----------------------------------------------------------------+ - | Option | Description | - +=========+================================================================+ - | height | If present and greater than zero, specifies the desired height | - | | of the pane area (not including internal padding or tabs). | - | | Otherwise, the maximum height of all panes is used. | - +---------+----------------------------------------------------------------+ - | padding | Specifies the amount of extra space to add around the outside | - | | of the notebook. The padding is a list up to four length | - | | specifications left top right bottom. If fewer than four | - | | elements are specified, bottom defaults to top, right defaults | - | | to left, and top defaults to left. | - +---------+----------------------------------------------------------------+ - | width | If present and greater than zero, specified the desired width | - | | of the pane area (not including internal padding). Otherwise, | - | | the maximum width of all panes is used. | - +---------+----------------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++---------+----------------------------------------------------------------+ +| Option | Description | ++=========+================================================================+ +| height | If present and greater than zero, specifies the desired height | +| | of the pane area (not including internal padding or tabs). | +| | Otherwise, the maximum height of all panes is used. | ++---------+----------------------------------------------------------------+ +| padding | Specifies the amount of extra space to add around the outside | +| | of the notebook. The padding is a list up to four length | +| | specifications left top right bottom. If fewer than four | +| | elements are specified, bottom defaults to top, right defaults | +| | to left, and top defaults to left. | ++---------+----------------------------------------------------------------+ +| width | If present and greater than zero, specified the desired width | +| | of the pane area (not including internal padding). Otherwise, | +| | the maximum width of all panes is used. 
| ++---------+----------------------------------------------------------------+ Tab Options @@ -499,39 +499,39 @@ Tab Options There are also specific options for tabs: - .. tabularcolumns:: |l|L| - - +-----------+--------------------------------------------------------------+ - | Option | Description | - +===========+==============================================================+ - | state | Either "normal", "disabled" or "hidden". If "disabled", then | - | | the tab is not selectable. If "hidden", then the tab is not | - | | shown. | - +-----------+--------------------------------------------------------------+ - | sticky | Specifies how the child window is positioned within the pane | - | | area. Value is a string containing zero or more of the | - | | characters "n", "s", "e" or "w". Each letter refers to a | - | | side (north, south, east or west) that the child window will | - | | stick to, as per the :meth:`grid` geometry manager. | - +-----------+--------------------------------------------------------------+ - | padding | Specifies the amount of extra space to add between the | - | | notebook and this pane. Syntax is the same as for the option | - | | padding used by this widget. | - +-----------+--------------------------------------------------------------+ - | text | Specifies a text to be displayed in the tab. | - +-----------+--------------------------------------------------------------+ - | image | Specifies an image to display in the tab. See the option | - | | image described in :class:`Widget`. | - +-----------+--------------------------------------------------------------+ - | compound | Specifies how to display the image relative to the text, in | - | | the case both options text and image are present. See | - | | `Label Options`_ for legal values. | - +-----------+--------------------------------------------------------------+ - | underline | Specifies the index (0-based) of a character to underline in | - | | the text string. The underlined character is used for | - | | mnemonic activation if :meth:`Notebook.enable_traversal` is | - | | called. | - +-----------+--------------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++-----------+--------------------------------------------------------------+ +| Option | Description | ++===========+==============================================================+ +| state | Either "normal", "disabled" or "hidden". If "disabled", then | +| | the tab is not selectable. If "hidden", then the tab is not | +| | shown. | ++-----------+--------------------------------------------------------------+ +| sticky | Specifies how the child window is positioned within the pane | +| | area. Value is a string containing zero or more of the | +| | characters "n", "s", "e" or "w". Each letter refers to a | +| | side (north, south, east or west) that the child window will | +| | stick to, as per the :meth:`grid` geometry manager. | ++-----------+--------------------------------------------------------------+ +| padding | Specifies the amount of extra space to add between the | +| | notebook and this pane. Syntax is the same as for the option | +| | padding used by this widget. | ++-----------+--------------------------------------------------------------+ +| text | Specifies a text to be displayed in the tab. | ++-----------+--------------------------------------------------------------+ +| image | Specifies an image to display in the tab. See the option | +| | image described in :class:`Widget`. 
| ++-----------+--------------------------------------------------------------+ +| compound | Specifies how to display the image relative to the text, in | +| | the case both options text and image are present. See | +| | `Label Options`_ for legal values. | ++-----------+--------------------------------------------------------------+ +| underline | Specifies the index (0-based) of a character to underline in | +| | the text string. The underlined character is used for | +| | mnemonic activation if :meth:`Notebook.enable_traversal` is | +| | called. | ++-----------+--------------------------------------------------------------+ Tab Identifiers @@ -663,36 +663,36 @@ Options This widget accepts the following specific options: - .. tabularcolumns:: |l|L| - - +----------+---------------------------------------------------------------+ - | Option | Description | - +==========+===============================================================+ - | orient | One of "horizontal" or "vertical". Specifies the orientation | - | | of the progress bar. | - +----------+---------------------------------------------------------------+ - | length | Specifies the length of the long axis of the progress bar | - | | (width if horizontal, height if vertical). | - +----------+---------------------------------------------------------------+ - | mode | One of "determinate" or "indeterminate". | - +----------+---------------------------------------------------------------+ - | maximum | A number specifying the maximum value. Defaults to 100. | - +----------+---------------------------------------------------------------+ - | value | The current value of the progress bar. In "determinate" mode, | - | | this represents the amount of work completed. In | - | | "indeterminate" mode, it is interpreted as modulo *maximum*; | - | | that is, the progress bar completes one "cycle" when its value| - | | increases by *maximum*. | - +----------+---------------------------------------------------------------+ - | variable | A name which is linked to the option value. If specified, the | - | | value of the progress bar is automatically set to the value of| - | | this name whenever the latter is modified. | - +----------+---------------------------------------------------------------+ - | phase | Read-only option. The widget periodically increments the value| - | | of this option whenever its value is greater than 0 and, in | - | | determinate mode, less than maximum. This option may be used | - | | by the current theme to provide additional animation effects. | - +----------+---------------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++----------+---------------------------------------------------------------+ +| Option | Description | ++==========+===============================================================+ +| orient | One of "horizontal" or "vertical". Specifies the orientation | +| | of the progress bar. | ++----------+---------------------------------------------------------------+ +| length | Specifies the length of the long axis of the progress bar | +| | (width if horizontal, height if vertical). | ++----------+---------------------------------------------------------------+ +| mode | One of "determinate" or "indeterminate". | ++----------+---------------------------------------------------------------+ +| maximum | A number specifying the maximum value. Defaults to 100. 
| ++----------+---------------------------------------------------------------+ +| value | The current value of the progress bar. In "determinate" mode, | +| | this represents the amount of work completed. In | +| | "indeterminate" mode, it is interpreted as modulo *maximum*; | +| | that is, the progress bar completes one "cycle" when its value| +| | increases by *maximum*. | ++----------+---------------------------------------------------------------+ +| variable | A name which is linked to the option value. If specified, the | +| | value of the progress bar is automatically set to the value of| +| | this name whenever the latter is modified. | ++----------+---------------------------------------------------------------+ +| phase | Read-only option. The widget periodically increments the value| +| | of this option whenever its value is greater than 0 and, in | +| | determinate mode, less than maximum. This option may be used | +| | by the current theme to provide additional animation effects. | ++----------+---------------------------------------------------------------+ ttk.Progressbar @@ -734,14 +734,14 @@ Options This widget accepts the following specific option: - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| - +--------+----------------------------------------------------------------+ - | Option | Description | - +========+================================================================+ - | orient | One of "horizontal" or "vertical". Specifies the orientation of| - | | the separator. | - +--------+----------------------------------------------------------------+ ++--------+----------------------------------------------------------------+ +| Option | Description | ++========+================================================================+ +| orient | One of "horizontal" or "vertical". Specifies the orientation of| +| | the separator. | ++--------+----------------------------------------------------------------+ Sizegrip @@ -802,49 +802,49 @@ Options This widget accepts the following specific options: - .. tabularcolumns:: |l|p{0.7\linewidth}| - - +----------------+--------------------------------------------------------+ - | Option | Description | - +================+========================================================+ - | columns | A list of column identifiers, specifying the number of | - | | columns and their names. | - +----------------+--------------------------------------------------------+ - | displaycolumns | A list of column identifiers (either symbolic or | - | | integer indices) specifying which data columns are | - | | displayed and the order in which they appear, or the | - | | string "#all". | - +----------------+--------------------------------------------------------+ - | height | Specifies the number of rows which should be visible. | - | | Note: the requested width is determined from the sum | - | | of the column widths. | - +----------------+--------------------------------------------------------+ - | padding | Specifies the internal padding for the widget. The | - | | padding is a list of up to four length specifications. | - +----------------+--------------------------------------------------------+ - | selectmode | Controls how the built-in class bindings manage the | - | | selection. One of "extended", "browse" or "none". | - | | If set to "extended" (the default), multiple items may | - | | be selected. If "browse", only a single item will be | - | | selected at a time. If "none", the selection will not | - | | be changed. 
| - | | | - | | Note that the application code and tag bindings can set| - | | the selection however they wish, regardless of the | - | | value of this option. | - +----------------+--------------------------------------------------------+ - | show | A list containing zero or more of the following values,| - | | specifying which elements of the tree to display. | - | | | - | | * tree: display tree labels in column #0. | - | | * headings: display the heading row. | - | | | - | | The default is "tree headings", i.e., show all | - | | elements. | - | | | - | | **Note**: Column #0 always refers to the tree column, | - | | even if show="tree" is not specified. | - +----------------+--------------------------------------------------------+ +.. tabularcolumns:: |l|p{0.7\linewidth}| + ++----------------+--------------------------------------------------------+ +| Option | Description | ++================+========================================================+ +| columns | A list of column identifiers, specifying the number of | +| | columns and their names. | ++----------------+--------------------------------------------------------+ +| displaycolumns | A list of column identifiers (either symbolic or | +| | integer indices) specifying which data columns are | +| | displayed and the order in which they appear, or the | +| | string "#all". | ++----------------+--------------------------------------------------------+ +| height | Specifies the number of rows which should be visible. | +| | Note: the requested width is determined from the sum | +| | of the column widths. | ++----------------+--------------------------------------------------------+ +| padding | Specifies the internal padding for the widget. The | +| | padding is a list of up to four length specifications. | ++----------------+--------------------------------------------------------+ +| selectmode | Controls how the built-in class bindings manage the | +| | selection. One of "extended", "browse" or "none". | +| | If set to "extended" (the default), multiple items may | +| | be selected. If "browse", only a single item will be | +| | selected at a time. If "none", the selection will not | +| | be changed. | +| | | +| | Note that the application code and tag bindings can set| +| | the selection however they wish, regardless of the | +| | value of this option. | ++----------------+--------------------------------------------------------+ +| show | A list containing zero or more of the following values,| +| | specifying which elements of the tree to display. | +| | | +| | * tree: display tree labels in column #0. | +| | * headings: display the heading row. | +| | | +| | The default is "tree headings", i.e., show all | +| | elements. | +| | | +| | **Note**: Column #0 always refers to the tree column, | +| | even if show="tree" is not specified. | ++----------------+--------------------------------------------------------+ Item Options @@ -853,27 +853,27 @@ Item Options The following item options may be specified for items in the insert and item widget commands. - .. tabularcolumns:: |l|L| - - +--------+---------------------------------------------------------------+ - | Option | Description | - +========+===============================================================+ - | text | The textual label to display for the item. | - +--------+---------------------------------------------------------------+ - | image | A Tk Image, displayed to the left of the label. 
| - +--------+---------------------------------------------------------------+ - | values | The list of values associated with the item. | - | | | - | | Each item should have the same number of values as the widget | - | | option columns. If there are fewer values than columns, the | - | | remaining values are assumed empty. If there are more values | - | | than columns, the extra values are ignored. | - +--------+---------------------------------------------------------------+ - | open | ``True``/``False`` value indicating whether the item's | - | | children should be displayed or hidden. | - +--------+---------------------------------------------------------------+ - | tags | A list of tags associated with this item. | - +--------+---------------------------------------------------------------+ +.. tabularcolumns:: |l|L| + ++--------+---------------------------------------------------------------+ +| Option | Description | ++========+===============================================================+ +| text | The textual label to display for the item. | ++--------+---------------------------------------------------------------+ +| image | A Tk Image, displayed to the left of the label. | ++--------+---------------------------------------------------------------+ +| values | The list of values associated with the item. | +| | | +| | Each item should have the same number of values as the widget | +| | option columns. If there are fewer values than columns, the | +| | remaining values are assumed empty. If there are more values | +| | than columns, the extra values are ignored. | ++--------+---------------------------------------------------------------+ +| open | ``True``/``False`` value indicating whether the item's | +| | children should be displayed or hidden. | ++--------+---------------------------------------------------------------+ +| tags | A list of tags associated with this item. | ++--------+---------------------------------------------------------------+ Tag Options @@ -881,20 +881,20 @@ Tag Options The following options may be specified on tags: - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| - +------------+-----------------------------------------------------------+ - | Option | Description | - +============+===========================================================+ - | foreground | Specifies the text foreground color. | - +------------+-----------------------------------------------------------+ - | background | Specifies the cell or item background color. | - +------------+-----------------------------------------------------------+ - | font | Specifies the font to use when drawing text. | - +------------+-----------------------------------------------------------+ - | image | Specifies the item image, in case the item's image option | - | | is empty. | - +------------+-----------------------------------------------------------+ ++------------+-----------------------------------------------------------+ +| Option | Description | ++============+===========================================================+ +| foreground | Specifies the text foreground color. | ++------------+-----------------------------------------------------------+ +| background | Specifies the cell or item background color. | ++------------+-----------------------------------------------------------+ +| font | Specifies the font to use when drawing text. 
| ++------------+-----------------------------------------------------------+ +| image | Specifies the item image, in case the item's image option | +| | is empty. | ++------------+-----------------------------------------------------------+ Column Identifiers @@ -926,19 +926,19 @@ Virtual Events The Treeview widget generates the following virtual events. - .. tabularcolumns:: |l|L| +.. tabularcolumns:: |l|L| - +--------------------+--------------------------------------------------+ - | Event | Description | - +====================+==================================================+ - | <> | Generated whenever the selection changes. | - +--------------------+--------------------------------------------------+ - | <> | Generated just before settings the focus item to | - | | open=True. | - +--------------------+--------------------------------------------------+ - | <> | Generated just after setting the focus item to | - | | open=False. | - +--------------------+--------------------------------------------------+ ++--------------------+--------------------------------------------------+ +| Event | Description | ++====================+==================================================+ +| <> | Generated whenever the selection changes. | ++--------------------+--------------------------------------------------+ +| <> | Generated just before settings the focus item to | +| | open=True. | ++--------------------+--------------------------------------------------+ +| <> | Generated just after setting the focus item to | +| | open=False. | ++--------------------+--------------------------------------------------+ The :meth:`Treeview.focus` and :meth:`Treeview.selection` methods can be used to determine the affected item or items. @@ -986,19 +986,19 @@ ttk.Treeview The valid options/values are: - * id + id Returns the column name. This is a read-only option. - * anchor: One of the standard Tk anchor values. + anchor: One of the standard Tk anchor values. Specifies how the text in this column should be aligned with respect to the cell. - * minwidth: width + minwidth: width The minimum width of the column in pixels. The treeview widget will not make the column any smaller than specified by this option when the widget is resized or the user drags a column. - * stretch: ``True``/``False`` + stretch: ``True``/``False`` Specifies whether the column's width should be adjusted when the widget is resized. - * width: width + width: width The width of the column in pixels. To configure the tree column, call this with column = "#0" @@ -1041,14 +1041,14 @@ ttk.Treeview The valid options/values are: - * text: text + text: text The text to display in the column heading. - * image: imageName + image: imageName Specifies an image to display to the right of the column heading. - * anchor: anchor + anchor: anchor Specifies how the heading text should be aligned. One of the standard Tk anchor values. - * command: callback + command: callback A callback to be invoked when the heading label is pressed. To configure the tree column heading, call this with column = "#0". @@ -1398,25 +1398,25 @@ option. If you don't know the class name of a widget, use the method by statespec/value pairs (this is the imagespec), and *kw* may have the following options: - * border=padding - padding is a list of up to four integers, specifying the left, top, - right, and bottom borders, respectively. + border=padding + padding is a list of up to four integers, specifying the left, top, + right, and bottom borders, respectively. 
- * height=height - Specifies a minimum height for the element. If less than zero, the - base image's height is used as a default. + height=height + Specifies a minimum height for the element. If less than zero, the + base image's height is used as a default. - * padding=padding - Specifies the element's interior padding. Defaults to border's value - if not specified. + padding=padding + Specifies the element's interior padding. Defaults to border's value + if not specified. - * sticky=spec - Specifies how the image is placed within the final parcel. spec - contains zero or more characters "n", "s", "w", or "e". + sticky=spec + Specifies how the image is placed within the final parcel. spec + contains zero or more characters "n", "s", "w", or "e". - * width=width - Specifies a minimum width for the element. If less than zero, the - base image's width is used as a default. + width=width + Specifies a minimum width for the element. If less than zero, the + base image's width is used as a default. If "from" is used as the value of *etype*, :meth:`element_create` will clone an existing @@ -1504,22 +1504,22 @@ uses a simplified version of the pack geometry manager: given an initial cavity, each element is allocated a parcel. Valid options/values are: - * side: whichside - Specifies which side of the cavity to place the element; one of - top, right, bottom or left. If omitted, the element occupies the - entire cavity. +side: whichside + Specifies which side of the cavity to place the element; one of + top, right, bottom or left. If omitted, the element occupies the + entire cavity. - * sticky: nswe - Specifies where the element is placed inside its allocated parcel. +sticky: nswe + Specifies where the element is placed inside its allocated parcel. - * unit: 0 or 1 - If set to 1, causes the element and all of its descendants to be treated as - a single element for the purposes of :meth:`Widget.identify` et al. It's - used for things like scrollbar thumbs with grips. +unit: 0 or 1 + If set to 1, causes the element and all of its descendants to be treated as + a single element for the purposes of :meth:`Widget.identify` et al. It's + used for things like scrollbar thumbs with grips. - * children: [sublayout... ] - Specifies a list of elements to place inside the element. Each - element is a tuple (or other sequence type) where the first item is - the layout name, and the other is a `Layout`_. +children: [sublayout... ] + Specifies a list of elements to place inside the element. Each + element is a tuple (or other sequence type) where the first item is + the layout name, and the other is a `Layout`_. .. _Layout: `Layouts`_ diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst index 8d877165a09740..d3398e9407d40f 100644 --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -813,8 +813,8 @@ This applies to :meth:`~Mock.assert_called_with`, :meth:`~Mock.assert_any_call`. When :ref:`auto-speccing`, it will also apply to method calls on the mock object. - .. versionchanged:: 3.4 - Added signature introspection on specced and autospecced mock objects. +.. versionchanged:: 3.4 + Added signature introspection on specced and autospecced mock objects. .. class:: PropertyMock(*args, **kwargs) @@ -1387,9 +1387,9 @@ patch .. note:: - .. versionchanged:: 3.5 - If you are patching builtins in a module then you don't - need to pass ``create=True``, it will be added by default. + .. 
versionchanged:: 3.5 + If you are patching builtins in a module then you don't + need to pass ``create=True``, it will be added by default. Patch can be used as a :class:`TestCase` class decorator. It works by decorating each test method in the class. This reduces the boilerplate diff --git a/Doc/library/urllib.request.rst b/Doc/library/urllib.request.rst index 35b8f5b471dd96..002dab8a65195c 100644 --- a/Doc/library/urllib.request.rst +++ b/Doc/library/urllib.request.rst @@ -315,10 +315,10 @@ The following classes are provided: list of hostname suffixes, optionally with ``:port`` appended, for example ``cern.ch,ncsa.uiuc.edu,some.host:8080``. - .. note:: + .. note:: - ``HTTP_PROXY`` will be ignored if a variable ``REQUEST_METHOD`` is set; - see the documentation on :func:`~urllib.request.getproxies`. + ``HTTP_PROXY`` will be ignored if a variable ``REQUEST_METHOD`` is set; + see the documentation on :func:`~urllib.request.getproxies`. .. class:: HTTPPasswordMgr() @@ -1536,9 +1536,9 @@ some point in the future. :mod:`urllib.request` Restrictions ---------------------------------- - .. index:: - pair: HTTP; protocol - pair: FTP; protocol +.. index:: + pair: HTTP; protocol + pair: FTP; protocol * Currently, only the following protocols are supported: HTTP (versions 0.9 and 1.0), FTP, local files, and data URLs. From 90b2620b6e7535e94bab802e9084a64c8670fcee Mon Sep 17 00:00:00 2001 From: Ezio Melotti Date: Wed, 11 Oct 2023 23:12:53 +0200 Subject: [PATCH 066/265] [3.11] gh-110631: Fix reST indentation (GH-110724) (#110739) * Fix wrong indentation in the other dirs. * Fix more wrong indentation.. (cherry picked from commit 718391f475f2550d99dd794069ca76312f7f6aa6) --- Doc/c-api/memory.rst | 24 ++++++++++++------------ Doc/howto/enum.rst | 15 ++++++++------- Doc/howto/instrumentation.rst | 14 ++++++-------- Doc/library/csv.rst | 6 +++--- Doc/library/dataclasses.rst | 12 +++++------- Doc/library/decimal.rst | 8 ++++---- Doc/using/windows.rst | 25 +++++++++++++------------ 7 files changed, 51 insertions(+), 53 deletions(-) diff --git a/Doc/c-api/memory.rst b/Doc/c-api/memory.rst index 8968b26b64320a..87b8603c4c0b10 100644 --- a/Doc/c-api/memory.rst +++ b/Doc/c-api/memory.rst @@ -487,18 +487,18 @@ Customize Memory Allocators :c:func:`PyMem_SetAllocator` does have the following contract: - * It can be called after :c:func:`Py_PreInitialize` and before - :c:func:`Py_InitializeFromConfig` to install a custom memory - allocator. There are no restrictions over the installed allocator - other than the ones imposed by the domain (for instance, the Raw - Domain allows the allocator to be called without the GIL held). See - :ref:`the section on allocator domains ` for more - information. - - * If called after Python has finish initializing (after - :c:func:`Py_InitializeFromConfig` has been called) the allocator - **must** wrap the existing allocator. Substituting the current - allocator for some other arbitrary one is **not supported**. + * It can be called after :c:func:`Py_PreInitialize` and before + :c:func:`Py_InitializeFromConfig` to install a custom memory + allocator. There are no restrictions over the installed allocator + other than the ones imposed by the domain (for instance, the Raw + Domain allows the allocator to be called without the GIL held). See + :ref:`the section on allocator domains ` for more + information. + + * If called after Python has finish initializing (after + :c:func:`Py_InitializeFromConfig` has been called) the allocator + **must** wrap the existing allocator. 
Substituting the current + allocator for some other arbitrary one is **not supported**. diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index 465be653b51adb..58849429539cca 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -1119,13 +1119,14 @@ the following are true: There is a new boundary mechanism that controls how out-of-range / invalid bits are handled: ``STRICT``, ``CONFORM``, ``EJECT``, and ``KEEP``: - * STRICT --> raises an exception when presented with invalid values - * CONFORM --> discards any invalid bits - * EJECT --> lose Flag status and become a normal int with the given value - * KEEP --> keep the extra bits - - keeps Flag status and extra bits - - extra bits do not show up in iteration - - extra bits do show up in repr() and str() +* STRICT --> raises an exception when presented with invalid values +* CONFORM --> discards any invalid bits +* EJECT --> lose Flag status and become a normal int with the given value +* KEEP --> keep the extra bits + + - keeps Flag status and extra bits + - extra bits do not show up in iteration + - extra bits do show up in repr() and str() The default for Flag is ``STRICT``, the default for ``IntFlag`` is ``EJECT``, and the default for ``_convert_`` is ``KEEP`` (see ``ssl.Options`` for an diff --git a/Doc/howto/instrumentation.rst b/Doc/howto/instrumentation.rst index 875f846aad0051..9c99fcecce1fcb 100644 --- a/Doc/howto/instrumentation.rst +++ b/Doc/howto/instrumentation.rst @@ -13,9 +13,9 @@ DTrace and SystemTap are monitoring tools, each providing a way to inspect what the processes on a computer system are doing. They both use domain-specific languages allowing a user to write scripts which: - - filter which processes are to be observed - - gather data from the processes of interest - - generate reports on the data +- filter which processes are to be observed +- gather data from the processes of interest +- generate reports on the data As of Python 3.6, CPython can be built with embedded "markers", also known as "probes", that can be observed by a DTrace or SystemTap script, @@ -246,11 +246,9 @@ The output looks like this: where the columns are: - - time in microseconds since start of script - - - name of executable - - - PID of process +- time in microseconds since start of script +- name of executable +- PID of process and the remainder indicates the call/return hierarchy as the script executes. diff --git a/Doc/library/csv.rst b/Doc/library/csv.rst index 0aad2a0d523cb0..52f76304a10529 100644 --- a/Doc/library/csv.rst +++ b/Doc/library/csv.rst @@ -284,9 +284,9 @@ The :mod:`csv` module defines the following classes: Inspecting each column, one of two key criteria will be considered to estimate if the sample contains a header: - - the second through n-th rows contain numeric values - - the second through n-th rows contain strings where at least one value's - length differs from that of the putative header of that column. + - the second through n-th rows contain numeric values + - the second through n-th rows contain strings where at least one value's + length differs from that of the putative header of that column. Twenty rows after the first row are sampled; if more than half of columns + rows meet the criteria, :const:`True` is returned. diff --git a/Doc/library/dataclasses.rst b/Doc/library/dataclasses.rst index 09f4ce812dc83a..915ff05d98a711 100644 --- a/Doc/library/dataclasses.rst +++ b/Doc/library/dataclasses.rst @@ -319,13 +319,11 @@ Module contents module-level method (see below). 
Users should never instantiate a :class:`Field` object directly. Its documented attributes are: - - ``name``: The name of the field. - - - ``type``: The type of the field. - - - ``default``, ``default_factory``, ``init``, ``repr``, ``hash``, - ``compare``, ``metadata``, and ``kw_only`` have the identical - meaning and values as they do in the :func:`field` function. + - ``name``: The name of the field. + - ``type``: The type of the field. + - ``default``, ``default_factory``, ``init``, ``repr``, ``hash``, + ``compare``, ``metadata``, and ``kw_only`` have the identical + meaning and values as they do in the :func:`field` function. Other attributes may exist, but they are private and must not be inspected or relied on. diff --git a/Doc/library/decimal.rst b/Doc/library/decimal.rst index d3e920da5df368..b4e3430b1008f1 100644 --- a/Doc/library/decimal.rst +++ b/Doc/library/decimal.rst @@ -1396,10 +1396,10 @@ In addition to the three supplied contexts, new contexts can be created with the With three arguments, compute ``(x**y) % modulo``. For the three argument form, the following restrictions on the arguments hold: - - all three arguments must be integral - - ``y`` must be nonnegative - - at least one of ``x`` or ``y`` must be nonzero - - ``modulo`` must be nonzero and have at most 'precision' digits + - all three arguments must be integral + - ``y`` must be nonnegative + - at least one of ``x`` or ``y`` must be nonzero + - ``modulo`` must be nonzero and have at most 'precision' digits The value resulting from ``Context.power(x, y, modulo)`` is equal to the value that would be obtained by computing ``(x**y) diff --git a/Doc/using/windows.rst b/Doc/using/windows.rst index 4c71b1db82cc6f..72c3a2b230a326 100644 --- a/Doc/using/windows.rst +++ b/Doc/using/windows.rst @@ -1174,21 +1174,22 @@ Otherwise, your users may experience problems using your application. Note that the first suggestion is the best, as the others may still be susceptible to non-standard paths in the registry and user site-packages. -.. versionchanged:: - 3.6 +.. versionchanged:: 3.6 + + Add ``._pth`` file support and removes ``applocal`` option from + ``pyvenv.cfg``. + +.. versionchanged:: 3.6 - * Adds ``._pth`` file support and removes ``applocal`` option from - ``pyvenv.cfg``. - * Adds :file:`python{XX}.zip` as a potential landmark when directly adjacent - to the executable. + Add :file:`python{XX}.zip` as a potential landmark when directly adjacent + to the executable. -.. deprecated:: - 3.6 +.. deprecated:: 3.6 - Modules specified in the registry under ``Modules`` (not ``PythonPath``) - may be imported by :class:`importlib.machinery.WindowsRegistryFinder`. - This finder is enabled on Windows in 3.6.0 and earlier, but may need to - be explicitly added to :data:`sys.meta_path` in the future. + Modules specified in the registry under ``Modules`` (not ``PythonPath``) + may be imported by :class:`importlib.machinery.WindowsRegistryFinder`. + This finder is enabled on Windows in 3.6.0 and earlier, but may need to + be explicitly added to :data:`sys.meta_path` in the future. Additional modules ================== From eb492d32b688534dfa0f956c0e7fa157dfacc2c3 Mon Sep 17 00:00:00 2001 From: Ezio Melotti Date: Thu, 12 Oct 2023 02:01:48 +0200 Subject: [PATCH 067/265] [3.11] gh-110631: Fix reST indentation in `Doc/reference` (GH-110708) (#110741) Fix wrong indentation in the Doc/reference dir.. 
(cherry picked from commit 41d8ec5a1bae1e5d4452da0a1a0649ace4ecb7b0) --- Doc/reference/compound_stmts.rst | 48 +++++++-------- Doc/reference/expressions.rst | 4 +- Doc/reference/import.rst | 94 +++++++++++++++--------------- Doc/reference/lexical_analysis.rst | 8 +-- 4 files changed, 78 insertions(+), 76 deletions(-) diff --git a/Doc/reference/compound_stmts.rst b/Doc/reference/compound_stmts.rst index fb0daf89357f90..19df7f63f12536 100644 --- a/Doc/reference/compound_stmts.rst +++ b/Doc/reference/compound_stmts.rst @@ -644,14 +644,14 @@ Here's an overview of the logical flow of a match statement: specified below. **Name bindings made during a successful pattern match outlive the executed block and can be used after the match statement**. - .. note:: + .. note:: - During failed pattern matches, some subpatterns may succeed. Do not - rely on bindings being made for a failed match. Conversely, do not - rely on variables remaining unchanged after a failed match. The exact - behavior is dependent on implementation and may vary. This is an - intentional decision made to allow different implementations to add - optimizations. + During failed pattern matches, some subpatterns may succeed. Do not + rely on bindings being made for a failed match. Conversely, do not + rely on variables remaining unchanged after a failed match. The exact + behavior is dependent on implementation and may vary. This is an + intentional decision made to allow different implementations to add + optimizations. #. If the pattern succeeds, the corresponding guard (if present) is evaluated. In this case all name bindings are guaranteed to have happened. @@ -1172,8 +1172,10 @@ In simple terms ``CLS(P1, attr=P2)`` matches only if the following happens: * ``isinstance(, CLS)`` * convert ``P1`` to a keyword pattern using ``CLS.__match_args__`` * For each keyword argument ``attr=P2``: - * ``hasattr(, "attr")`` - * ``P2`` matches ``.attr`` + + * ``hasattr(, "attr")`` + * ``P2`` matches ``.attr`` + * ... and so on for the corresponding keyword argument/pattern pair. .. seealso:: @@ -1600,29 +1602,29 @@ body of a coroutine function. .. [#] In pattern matching, a sequence is defined as one of the following: - * a class that inherits from :class:`collections.abc.Sequence` - * a Python class that has been registered as :class:`collections.abc.Sequence` - * a builtin class that has its (CPython) :c:macro:`Py_TPFLAGS_SEQUENCE` bit set - * a class that inherits from any of the above + * a class that inherits from :class:`collections.abc.Sequence` + * a Python class that has been registered as :class:`collections.abc.Sequence` + * a builtin class that has its (CPython) :c:macro:`Py_TPFLAGS_SEQUENCE` bit set + * a class that inherits from any of the above The following standard library classes are sequences: - * :class:`array.array` - * :class:`collections.deque` - * :class:`list` - * :class:`memoryview` - * :class:`range` - * :class:`tuple` + * :class:`array.array` + * :class:`collections.deque` + * :class:`list` + * :class:`memoryview` + * :class:`range` + * :class:`tuple` .. note:: Subject values of type ``str``, ``bytes``, and ``bytearray`` do not match sequence patterns. .. 
[#] In pattern matching, a mapping is defined as one of the following: - * a class that inherits from :class:`collections.abc.Mapping` - * a Python class that has been registered as :class:`collections.abc.Mapping` - * a builtin class that has its (CPython) :c:macro:`Py_TPFLAGS_MAPPING` bit set - * a class that inherits from any of the above + * a class that inherits from :class:`collections.abc.Mapping` + * a Python class that has been registered as :class:`collections.abc.Mapping` + * a builtin class that has its (CPython) :c:macro:`Py_TPFLAGS_MAPPING` bit set + * a class that inherits from any of the above The standard library classes :class:`dict` and :class:`types.MappingProxyType` are mappings. diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index 8320cd0ae75193..3c4706714f306c 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -499,8 +499,8 @@ the yield expression. It can be either set explicitly when raising :exc:`StopIteration`, or automatically when the subiterator is a generator (by returning a value from the subgenerator). - .. versionchanged:: 3.3 - Added ``yield from `` to delegate control flow to a subiterator. +.. versionchanged:: 3.3 + Added ``yield from `` to delegate control flow to a subiterator. The parentheses may be omitted when the yield expression is the sole expression on the right hand side of an assignment statement. diff --git a/Doc/reference/import.rst b/Doc/reference/import.rst index 70d946ab4db9e2..c44fb9815e9f99 100644 --- a/Doc/reference/import.rst +++ b/Doc/reference/import.rst @@ -373,32 +373,32 @@ of what happens during the loading portion of import:: Note the following details: - * If there is an existing module object with the given name in - :data:`sys.modules`, import will have already returned it. +* If there is an existing module object with the given name in + :data:`sys.modules`, import will have already returned it. - * The module will exist in :data:`sys.modules` before the loader - executes the module code. This is crucial because the module code may - (directly or indirectly) import itself; adding it to :data:`sys.modules` - beforehand prevents unbounded recursion in the worst case and multiple - loading in the best. +* The module will exist in :data:`sys.modules` before the loader + executes the module code. This is crucial because the module code may + (directly or indirectly) import itself; adding it to :data:`sys.modules` + beforehand prevents unbounded recursion in the worst case and multiple + loading in the best. - * If loading fails, the failing module -- and only the failing module -- - gets removed from :data:`sys.modules`. Any module already in the - :data:`sys.modules` cache, and any module that was successfully loaded - as a side-effect, must remain in the cache. This contrasts with - reloading where even the failing module is left in :data:`sys.modules`. +* If loading fails, the failing module -- and only the failing module -- + gets removed from :data:`sys.modules`. Any module already in the + :data:`sys.modules` cache, and any module that was successfully loaded + as a side-effect, must remain in the cache. This contrasts with + reloading where even the failing module is left in :data:`sys.modules`. - * After the module is created but before execution, the import machinery - sets the import-related module attributes ("_init_module_attrs" in - the pseudo-code example above), as summarized in a - :ref:`later section `. 
+* After the module is created but before execution, the import machinery + sets the import-related module attributes ("_init_module_attrs" in + the pseudo-code example above), as summarized in a + :ref:`later section `. - * Module execution is the key moment of loading in which the module's - namespace gets populated. Execution is entirely delegated to the - loader, which gets to decide what gets populated and how. +* Module execution is the key moment of loading in which the module's + namespace gets populated. Execution is entirely delegated to the + loader, which gets to decide what gets populated and how. - * The module created during loading and passed to exec_module() may - not be the one returned at the end of import [#fnlo]_. +* The module created during loading and passed to exec_module() may + not be the one returned at the end of import [#fnlo]_. .. versionchanged:: 3.4 The import system has taken over the boilerplate responsibilities of @@ -415,13 +415,13 @@ returned from :meth:`~importlib.abc.Loader.exec_module` is ignored. Loaders must satisfy the following requirements: - * If the module is a Python module (as opposed to a built-in module or a - dynamically loaded extension), the loader should execute the module's code - in the module's global name space (``module.__dict__``). +* If the module is a Python module (as opposed to a built-in module or a + dynamically loaded extension), the loader should execute the module's code + in the module's global name space (``module.__dict__``). - * If the loader cannot execute the module, it should raise an - :exc:`ImportError`, although any other exception raised during - :meth:`~importlib.abc.Loader.exec_module` will be propagated. +* If the loader cannot execute the module, it should raise an + :exc:`ImportError`, although any other exception raised during + :meth:`~importlib.abc.Loader.exec_module` will be propagated. In many cases, the finder and loader can be the same object; in such cases the :meth:`~importlib.abc.MetaPathFinder.find_spec` method would just return a @@ -451,20 +451,20 @@ import machinery will create the new module itself. functionality described above in addition to executing the module. All the same constraints apply, with some additional clarification: - * If there is an existing module object with the given name in - :data:`sys.modules`, the loader must use that existing module. - (Otherwise, :func:`importlib.reload` will not work correctly.) If the - named module does not exist in :data:`sys.modules`, the loader - must create a new module object and add it to :data:`sys.modules`. + * If there is an existing module object with the given name in + :data:`sys.modules`, the loader must use that existing module. + (Otherwise, :func:`importlib.reload` will not work correctly.) If the + named module does not exist in :data:`sys.modules`, the loader + must create a new module object and add it to :data:`sys.modules`. - * The module *must* exist in :data:`sys.modules` before the loader - executes the module code, to prevent unbounded recursion or multiple - loading. + * The module *must* exist in :data:`sys.modules` before the loader + executes the module code, to prevent unbounded recursion or multiple + loading. - * If loading fails, the loader must remove any modules it has inserted - into :data:`sys.modules`, but it must remove **only** the failing - module(s), and only if the loader itself has loaded the module(s) - explicitly. 
+ * If loading fails, the loader must remove any modules it has inserted + into :data:`sys.modules`, but it must remove **only** the failing + module(s), and only if the loader itself has loaded the module(s) + explicitly. .. versionchanged:: 3.5 A :exc:`DeprecationWarning` is raised when ``exec_module()`` is defined but @@ -664,17 +664,17 @@ with defaults for whatever information is missing. Here are the exact rules used: - * If the module has a ``__spec__`` attribute, the information in the spec - is used to generate the repr. The "name", "loader", "origin", and - "has_location" attributes are consulted. +* If the module has a ``__spec__`` attribute, the information in the spec + is used to generate the repr. The "name", "loader", "origin", and + "has_location" attributes are consulted. - * If the module has a ``__file__`` attribute, this is used as part of the - module's repr. +* If the module has a ``__file__`` attribute, this is used as part of the + module's repr. - * If the module has no ``__file__`` but does have a ``__loader__`` that is not - ``None``, then the loader's repr is used as part of the module's repr. +* If the module has no ``__file__`` but does have a ``__loader__`` that is not + ``None``, then the loader's repr is used as part of the module's repr. - * Otherwise, just use the module's ``__name__`` in the repr. +* Otherwise, just use the module's ``__name__`` in the repr. .. versionchanged:: 3.4 Use of :meth:`loader.module_repr() ` diff --git a/Doc/reference/lexical_analysis.rst b/Doc/reference/lexical_analysis.rst index 4fadae3ea2b045..b25d2c862927fe 100644 --- a/Doc/reference/lexical_analysis.rst +++ b/Doc/reference/lexical_analysis.rst @@ -641,10 +641,10 @@ is more easily recognized as broken.) It is also important to note that the escape sequences only recognized in string literals fall into the category of unrecognized escapes for bytes literals. - .. versionchanged:: 3.6 - Unrecognized escape sequences produce a :exc:`DeprecationWarning`. In - a future Python version they will be a :exc:`SyntaxWarning` and - eventually a :exc:`SyntaxError`. +.. versionchanged:: 3.6 + Unrecognized escape sequences produce a :exc:`DeprecationWarning`. In + a future Python version they will be a :exc:`SyntaxWarning` and + eventually a :exc:`SyntaxError`. Even in a raw literal, quotes can be escaped with a backslash, but the backslash remains in the result; for example, ``r"\""`` is a valid string From f98903542c55823bf0f149840d81729003ffc9d0 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 12 Oct 2023 10:09:49 +0200 Subject: [PATCH 068/265] [3.11] gh-110673: test_pty raises on short write (GH-110677) (#110743) gh-110673: test_pty raises on short write (GH-110677) Add write_all() helper function to test_pty to raise an exception on short write: if os.writes() does not write all bytes. It should not happen for a PTY. 
(cherry picked from commit b4e8049766a46a9e6548b18d7e9a0c9f573cd122) Co-authored-by: Victor Stinner --- Lib/test/test_pty.py | 19 ++++++++++++++----- 1 file changed, 14 insertions(+), 5 deletions(-) diff --git a/Lib/test/test_pty.py b/Lib/test/test_pty.py index a971f6b0250efb..f31a68c5d84e03 100644 --- a/Lib/test/test_pty.py +++ b/Lib/test/test_pty.py @@ -76,6 +76,15 @@ def expectedFailureIfStdinIsTTY(fun): pass return fun + +def write_all(fd, data): + written = os.write(fd, data) + if written != len(data): + # gh-73256, gh-110673: It should never happen, but check just in case + raise Exception(f"short write: os.write({fd}, {len(data)} bytes) " + f"wrote {written} bytes") + + # Marginal testing of pty suite. Cannot do extensive 'do or fail' testing # because pty code is not too portable. class PtyTest(unittest.TestCase): @@ -170,14 +179,14 @@ def test_openpty(self): os.set_blocking(master_fd, blocking) debug("Writing to slave_fd") - os.write(slave_fd, TEST_STRING_1) + write_all(slave_fd, TEST_STRING_1) s1 = _readline(master_fd) self.assertEqual(b'I wish to buy a fish license.\n', normalize_output(s1)) debug("Writing chunked output") - os.write(slave_fd, TEST_STRING_2[:5]) - os.write(slave_fd, TEST_STRING_2[5:]) + write_all(slave_fd, TEST_STRING_2[:5]) + write_all(slave_fd, TEST_STRING_2[5:]) s2 = _readline(master_fd) self.assertEqual(b'For my pet fish, Eric.\n', normalize_output(s2)) @@ -360,8 +369,8 @@ def test__copy_to_each(self): masters = [s.fileno() for s in socketpair] # Feed data. Smaller than PIPEBUF. These writes will not block. - os.write(masters[1], b'from master') - os.write(write_to_stdin_fd, b'from stdin') + write_all(masters[1], b'from master') + write_all(write_to_stdin_fd, b'from stdin') # Expect three select calls, the last one will cause IndexError pty.select = self._mock_select From 5178fb0b89586be2b572bae7fc04a7e63168eee8 Mon Sep 17 00:00:00 2001 From: "Erlend E. Aasland" Date: Thu, 12 Oct 2023 11:52:59 +0200 Subject: [PATCH 069/265] [3.11] GH-107518: Remove the Argument Clinic How-To (#109900) (#110761) (cherry picked from commit d1f7fae424d51b0374c8204599583c4a26c1a992) * Remove the content of the Argument Clinic HOWTO * Update cross-references to the Argument Clinic * Add a note directing readers to the devguide Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> --- Doc/howto/clinic.rst | 1926 +--------------------------------------- Doc/howto/index.rst | 1 - Doc/whatsnew/3.8.rst | 2 +- Misc/NEWS.d/3.11.5.rst | 2 +- 4 files changed, 7 insertions(+), 1924 deletions(-) diff --git a/Doc/howto/clinic.rst b/Doc/howto/clinic.rst index 5f4d3977bc1605..060977246149cf 100644 --- a/Doc/howto/clinic.rst +++ b/Doc/howto/clinic.rst @@ -1,1930 +1,14 @@ -.. highlight:: c +:orphan: -.. _howto-clinic: +.. This page is retained solely for existing links to /howto/clinic.html. + Direct readers to the devguide. ********************** Argument Clinic How-To ********************** -:author: Larry Hastings - -**Source code:** :source:`Tools/clinic/clinic.py`. - -.. topic:: Abstract - - Argument Clinic is a preprocessor for CPython C files. - It was introduced in Python 3.4 with :pep:`436`, - in order to provide introspection signatures, - and to generate performant and tailor-made boilerplate code - for argument parsing in CPython builtins, - module level functions, and class methods. - This document is divided in four major sections: - - * :ref:`clinic-background` talks about the basic concepts and goals of - Argument Clinic. 
- * :ref:`clinic-reference` describes the command-line interface and Argument - Clinic terminology. - * :ref:`clinic-tutorial` guides you through all the steps required to - adapt an existing C function to Argument Clinic. - * :ref:`clinic-howtos` details how to handle specific tasks. - - -.. note:: - - Argument Clinic is considered internal-only - for CPython. Its use is not supported for files outside - CPython, and no guarantees are made regarding backwards - compatibility for future versions. In other words: if you - maintain an external C extension for CPython, you're welcome - to experiment with Argument Clinic in your own code. But the - version of Argument Clinic that ships with the next version - of CPython *could* be totally incompatible and break all your code. - - -.. _clinic-background: - -Background -========== - -Basic concepts --------------- - -When Argument Clinic is run on a file, either via the :ref:`clinic-cli` -or via ``make clinic``, it will scan over the input files looking for -:term:`start lines `: - -.. code-block:: none - - /*[clinic input] - -When it finds one, it reads everything up to the :term:`end line`: - -.. code-block:: none - - [clinic start generated code]*/ - -Everything in between these two lines is Argument Clinic :term:`input`. -When Argument Clinic parses input, it generates :term:`output`. -The output is rewritten into the C file immediately after the input, -followed by a :term:`checksum line`. -All of these lines, including the :term:`start line` and :term:`checksum line`, -are collectively called an Argument Clinic :term:`block`: - -.. code-block:: none - - /*[clinic input] - ... clinic input goes here ... - [clinic start generated code]*/ - ... clinic output goes here ... - /*[clinic end generated code: ...]*/ - -If you run Argument Clinic on the same file a second time, Argument Clinic -will discard the old :term:`output` and write out the new output with a fresh -:term:`checksum line`. -If the :term:`input` hasn't changed, the output won't change either. .. note:: - You should never modify the output of an Argument Clinic block, - as any change will be lost in future Argument Clinic runs; - Argument Clinic will detect an output checksum mismatch and regenerate the - correct output. - If you are not happy with the generated output, - you should instead change the input until it produces the output you want. - - -.. _clinic-reference: - -Reference -========= - - -.. _clinic-terminology: - -Terminology ------------ - -.. glossary:: - - start line - The line ``/*[clinic input]``. - This line marks the beginning of Argument Clinic input. - Note that the *start line* opens a C block comment. - - end line - The line ``[clinic start generated code]*/``. - The *end line* marks the _end_ of Argument Clinic :term:`input`, - but at the same time marks the _start_ of Argument Clinic :term:`output`, - thus the text *"clinic start start generated code"* - Note that the *end line* closes the C block comment opened - by the *start line*. - - checksum - A hash to distinguish unique :term:`inputs ` - and :term:`outputs `. - - checksum line - A line that looks like ``/*[clinic end generated code: ...]*/``. - The three dots will be replaced by a :term:`checksum` generated from the - :term:`input`, and a :term:`checksum` generated from the :term:`output`. - The checksum line marks the end of Argument Clinic generated code, - and is used by Argument Clinic to determine if it needs to regenerate - output. 
- - input - The text between the :term:`start line` and the :term:`end line`. - Note that the start and end lines open and close a C block comment; - the *input* is thus a part of that same C block comment. - - output - The text between the :term:`end line` and the :term:`checksum line`. - - block - All text from the :term:`start line` to the :term:`checksum line` inclusively. - - -.. _clinic-cli: - -Command-line interface ----------------------- - -The Argument Clinic :abbr:`CLI (Command-Line Interface)` is typically used to -process a single source file, like this: - -.. code-block:: shell-session - - $ python3 ./Tools/clinic/clinic.py foo.c - -The CLI supports the following options: - -.. program:: ./Tools/clinic/clinic.py [-h] [-f] [-o OUTPUT] [-v] \ - [--converters] [--make] [--srcdir SRCDIR] [FILE ...] - -.. option:: -h, --help - - Print CLI usage. - -.. option:: -f, --force - - Force output regeneration. - -.. option:: -o, --output OUTPUT - - Redirect file output to OUTPUT - -.. option:: -v, --verbose - - Enable verbose mode. - -.. option:: --converters - - Print a list of all supported converters and return converters. - -.. option:: --make - - Walk :option:`--srcdir` to run over all relevant files. - -.. option:: --srcdir SRCDIR - - The directory tree to walk in :option:`--make` mode. - -.. option:: FILE ... - - The list of files to process. - - -.. _clinic-classes: - -Classes for extending Argument Clinic -------------------------------------- - -.. module:: clinic - -.. class:: CConverter - - The base class for all converters. - See :ref:`clinic-howto-custom-converter` for how to subclass this class. - - .. attribute:: type - - The C type to use for this variable. - :attr:`!type` should be a Python string specifying the type, - e.g. ``'int'``. - If this is a pointer type, the type string should end with ``' *'``. - - .. attribute:: default - - The Python default value for this parameter, as a Python value. - Or the magic value ``unspecified`` if there is no default. - - .. attribute:: py_default - - :attr:`!default` as it should appear in Python code, - as a string. - Or ``None`` if there is no default. - - .. attribute:: c_default - - :attr:`!default` as it should appear in C code, - as a string. - Or ``None`` if there is no default. - - .. attribute:: c_ignored_default - - The default value used to initialize the C variable when - there is no default, but not specifying a default may - result in an "uninitialized variable" warning. This can - easily happen when using option groups—although - properly written code will never actually use this value, - the variable does get passed in to the impl, and the - C compiler will complain about the "use" of the - uninitialized value. This value should always be a - non-empty string. - - .. attribute:: converter - - The name of the C converter function, as a string. - - .. attribute:: impl_by_reference - - A boolean value. If true, - Argument Clinic will add a ``&`` in front of the name of - the variable when passing it into the impl function. - - .. attribute:: parse_by_reference - - A boolean value. If true, - Argument Clinic will add a ``&`` in front of the name of - the variable when passing it into :c:func:`PyArg_ParseTuple`. - - -.. _clinic-tutorial: - -Tutorial -======== - -The best way to get a sense of how Argument Clinic works is to -convert a function to work with it. Here, then, are the bare -minimum steps you'd need to follow to convert a function to -work with Argument Clinic. 
Note that for code you plan to -check in to CPython, you really should take the conversion farther, -using some of the :ref:`advanced concepts ` -you'll see later on in the document, -like :ref:`clinic-howto-return-converters` -and :ref:`clinic-howto-self-converter`. -But we'll keep it simple for this walkthrough so you can learn. - -First, make sure you're working with a freshly updated checkout -of the CPython trunk. - -Next, find a Python builtin that calls either :c:func:`PyArg_ParseTuple` -or :c:func:`PyArg_ParseTupleAndKeywords`, and hasn't been converted -to work with Argument Clinic yet. -For this tutorial, we'll be using -:py:meth:`_pickle.Pickler.dump `. - -If the call to the :c:func:`!PyArg_Parse*` function uses any of the -following format units...: - - .. code-block:: none - - O& - O! - es - es# - et - et# - -... or if it has multiple calls to :c:func:`PyArg_ParseTuple`, -you should choose a different function. -(See :ref:`clinic-howto-advanced-converters` for those scenarios.) - -Also, if the function has multiple calls to :c:func:`!PyArg_ParseTuple` -or :c:func:`PyArg_ParseTupleAndKeywords` where it supports different -types for the same argument, or if the function uses something besides -:c:func:`!PyArg_Parse*` functions to parse its arguments, it probably -isn't suitable for conversion to Argument Clinic. Argument Clinic -doesn't support generic functions or polymorphic parameters. - -Next, add the following boilerplate above the function, -creating our input block:: - - /*[clinic input] - [clinic start generated code]*/ - -Cut the docstring and paste it in between the ``[clinic]`` lines, -removing all the junk that makes it a properly quoted C string. -When you're done you should have just the text, based at the left -margin, with no line wider than 80 characters. -Argument Clinic will preserve indents inside the docstring. - -If the old docstring had a first line that looked like a function -signature, throw that line away; The docstring doesn't need it anymore --- -when you use :py:func:`help` on your builtin in the future, -the first line will be built automatically based on the function's signature. - -Example docstring summary line:: - - /*[clinic input] - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -If your docstring doesn't have a "summary" line, Argument Clinic will -complain, so let's make sure it has one. The "summary" line should -be a paragraph consisting of a single 80-column line -at the beginning of the docstring. -(See :pep:`257` regarding docstring conventions.) - -Our example docstring consists solely of a summary line, so the sample -code doesn't have to change for this step. - -Now, above the docstring, enter the name of the function, followed -by a blank line. This should be the Python name of the function, -and should be the full dotted path to the function --- -it should start with the name of the module, -include any sub-modules, and if the function is a method on -a class it should include the class name too. - -In our example, :mod:`!_pickle` is the module, :py:class:`!Pickler` is the class, -and :py:meth:`!dump` is the method, so the name becomes -:py:meth:`!_pickle.Pickler.dump`:: - - /*[clinic input] - _pickle.Pickler.dump - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -If this is the first time that module or class has been used with Argument -Clinic in this C file, -you must declare the module and/or class. 
Proper Argument Clinic hygiene -prefers declaring these in a separate block somewhere near the -top of the C file, in the same way that include files and statics go at -the top. -In our sample code we'll just show the two blocks next to each other. - -The name of the class and module should be the same as the one -seen by Python. Check the name defined in the :c:type:`PyModuleDef` -or :c:type:`PyTypeObject` as appropriate. - -When you declare a class, you must also specify two aspects of its type -in C: the type declaration you'd use for a pointer to an instance of -this class, and a pointer to the :c:type:`!PyTypeObject` for this class:: - - /*[clinic input] - module _pickle - class _pickle.Pickler "PicklerObject *" "&Pickler_Type" - [clinic start generated code]*/ - - /*[clinic input] - _pickle.Pickler.dump - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -Declare each of the parameters to the function. Each parameter -should get its own line. All the parameter lines should be -indented from the function name and the docstring. -The general form of these parameter lines is as follows: - -.. code-block:: none - - name_of_parameter: converter - -If the parameter has a default value, add that after the -converter: - -.. code-block:: none - - name_of_parameter: converter = default_value - -Argument Clinic's support for "default values" is quite sophisticated; -see :ref:`clinic-howto-default-values` for more information. - -Next, add a blank line below the parameters. - -What's a "converter"? -It establishes both the type of the variable used in C, -and the method to convert the Python value into a C value at runtime. -For now you're going to use what's called a "legacy converter" --- -a convenience syntax intended to make porting old code into Argument -Clinic easier. - -For each parameter, copy the "format unit" for that -parameter from the :c:func:`PyArg_Parse` format argument and -specify *that* as its converter, as a quoted string. -The "format unit" is the formal name for the one-to-three -character substring of the *format* parameter that tells -the argument parsing function what the type of the variable -is and how to convert it. -For more on format units please see :ref:`arg-parsing`. - -For multicharacter format units like ``z#``, -use the entire two-or-three character string. - -Sample:: - - /*[clinic input] - module _pickle - class _pickle.Pickler "PicklerObject *" "&Pickler_Type" - [clinic start generated code]*/ - - /*[clinic input] - _pickle.Pickler.dump - - obj: 'O' - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -If your function has ``|`` in the format string, -meaning some parameters have default values, you can ignore it. -Argument Clinic infers which parameters are optional -based on whether or not they have default values. - -If your function has ``$`` in the format string, -meaning it takes keyword-only arguments, -specify ``*`` on a line by itself before the first keyword-only argument, -indented the same as the parameter lines. - -:py:meth:`!_pickle.Pickler.dump` has neither, so our sample is unchanged. - -Next, if the existing C function calls :c:func:`PyArg_ParseTuple` -(as opposed to :c:func:`PyArg_ParseTupleAndKeywords`), then all its -arguments are positional-only. - -To mark parameters as positional-only in Argument Clinic, -add a ``/`` on a line by itself after the last positional-only parameter, -indented the same as the parameter lines. 
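For orientation, here is a minimal sketch of the kind of hand-written
``METH_VARARGS`` boilerplate that such a conversion replaces.  The names are
illustrative only (this is not the actual :source:`Modules/_pickle.c` code),
and the snippet assumes ``Python.h`` has been included::

    static PyObject *
    sketch_dump(PyObject *self, PyObject *args)
    {
        PyObject *obj;

        /* One required positional object argument, no keywords. */
        if (!PyArg_ParseTuple(args, "O:dump", &obj)) {
            return NULL;
        }
        /* ... the existing implementation would continue here ... */
        Py_RETURN_NONE;
    }

The sample that follows shows the Argument Clinic input that takes over
exactly this argument-parsing chore.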
- -Sample:: - - /*[clinic input] - module _pickle - class _pickle.Pickler "PicklerObject *" "&Pickler_Type" - [clinic start generated code]*/ - - /*[clinic input] - _pickle.Pickler.dump - - obj: 'O' - / - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -It can be helpful to write a per-parameter docstring for each parameter. -Since per-parameter docstrings are optional, -you can skip this step if you prefer. - -Nevertheless, here's how to add a per-parameter docstring. -The first line of the per-parameter docstring -must be indented further than the parameter definition. -The left margin of this first line establishes -the left margin for the whole per-parameter docstring; -all the text you write will be outdented by this amount. -You can write as much text as you like, across multiple lines if you wish. - -Sample:: - - /*[clinic input] - module _pickle - class _pickle.Pickler "PicklerObject *" "&Pickler_Type" - [clinic start generated code]*/ - - /*[clinic input] - _pickle.Pickler.dump - - obj: 'O' - The object to be pickled. - / - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -Save and close the file, then run ``Tools/clinic/clinic.py`` on it. -With luck everything worked---your block now has output, -and a :file:`.c.h` file has been generated! -Reload the file in your text editor to see the generated code:: - - /*[clinic input] - _pickle.Pickler.dump - - obj: 'O' - The object to be pickled. - / - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - - static PyObject * - _pickle_Pickler_dump(PicklerObject *self, PyObject *obj) - /*[clinic end generated code: output=87ecad1261e02ac7 input=552eb1c0f52260d9]*/ - -Obviously, if Argument Clinic didn't produce any output, -it's because it found an error in your input. -Keep fixing your errors and retrying until Argument Clinic processes your file -without complaint. - -For readability, most of the glue code has been generated to a :file:`.c.h` -file. You'll need to include that in your original :file:`.c` file, -typically right after the clinic module block:: - - #include "clinic/_pickle.c.h" - -Double-check that the argument-parsing code Argument Clinic generated -looks basically the same as the existing code. - -First, ensure both places use the same argument-parsing function. -The existing code must call either -:c:func:`PyArg_ParseTuple` or :c:func:`PyArg_ParseTupleAndKeywords`; -ensure that the code generated by Argument Clinic calls the -*exact* same function. - -Second, the format string passed in to :c:func:`!PyArg_ParseTuple` or -:c:func:`!PyArg_ParseTupleAndKeywords` should be *exactly* the same -as the hand-written one in the existing function, -up to the colon or semi-colon. - -Argument Clinic always generates its format strings -with a ``:`` followed by the name of the function. -If the existing code's format string ends with ``;``, -to provide usage help, this change is harmless --- don't worry about it. - -Third, for parameters whose format units require two arguments, -like a length variable, an encoding string, or a pointer -to a conversion function, ensure that the second argument is -*exactly* the same between the two invocations. 
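To make that third check concrete, here is a hypothetical hand-written call
using the two-argument ``O&`` format unit.  The converter function it passes
(the invented name ``widget_converter`` below) is the "second argument" that
must reappear, unchanged, in the call Argument Clinic generates::

    /* Illustrative O& converter: return 1 on success, 0 on failure
       (with an exception set), storing the result through *address. */
    static int
    widget_converter(PyObject *obj, void *address)
    {
        *(PyObject **)address = obj;
        return 1;
    }

    static PyObject *
    sketch_frobnicate(PyObject *module, PyObject *args)
    {
        PyObject *widget;
        if (!PyArg_ParseTuple(args, "O&:frobnicate",
                              widget_converter, &widget)) {
            return NULL;
        }
        Py_RETURN_NONE;
    }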
- -Fourth, inside the output portion of the block, -you'll find a preprocessor macro defining the appropriate static -:c:type:`PyMethodDef` structure for this builtin:: - - #define __PICKLE_PICKLER_DUMP_METHODDEF \ - {"dump", (PyCFunction)__pickle_Pickler_dump, METH_O, __pickle_Pickler_dump__doc__}, - -This static structure should be *exactly* the same as the existing static -:c:type:`!PyMethodDef` structure for this builtin. - -If any of these items differ in *any way*, -adjust your Argument Clinic function specification and rerun -``Tools/clinic/clinic.py`` until they *are* the same. - -Notice that the last line of its output is the declaration -of your "impl" function. This is where the builtin's implementation goes. -Delete the existing prototype of the function you're modifying, but leave -the opening curly brace. Now delete its argument parsing code and the -declarations of all the variables it dumps the arguments into. -Notice how the Python arguments are now arguments to this impl function; -if the implementation used different names for these variables, fix it. - -Let's reiterate, just because it's kind of weird. -Your code should now look like this:: - - static return_type - your_function_impl(...) - /*[clinic end generated code: input=..., output=...]*/ - { - ... - -Argument Clinic generated the checksum line and the function prototype just -above it. You should write the opening and closing curly braces for the -function, and the implementation inside. - -Sample:: - - /*[clinic input] - module _pickle - class _pickle.Pickler "PicklerObject *" "&Pickler_Type" - [clinic start generated code]*/ - /*[clinic end generated code: checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709]*/ - - /*[clinic input] - _pickle.Pickler.dump - - obj: 'O' - The object to be pickled. - / - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - - PyDoc_STRVAR(__pickle_Pickler_dump__doc__, - "Write a pickled representation of obj to the open file.\n" - "\n" - ... - static PyObject * - _pickle_Pickler_dump_impl(PicklerObject *self, PyObject *obj) - /*[clinic end generated code: checksum=3bd30745bf206a48f8b576a1da3d90f55a0a4187]*/ - { - /* Check whether the Pickler was initialized correctly (issue3664). - Developers often forget to call __init__() in their subclasses, which - would trigger a segfault without this check. */ - if (self->write == NULL) { - PyErr_Format(PicklingError, - "Pickler.__init__() was not called by %s.__init__()", - Py_TYPE(self)->tp_name); - return NULL; - } - - if (_Pickler_ClearBuffer(self) < 0) { - return NULL; - } - - ... - -Remember the macro with the :c:type:`PyMethodDef` structure for this function? -Find the existing :c:type:`!PyMethodDef` structure for this -function and replace it with a reference to the macro. If the builtin -is at module scope, this will probably be very near the end of the file; -if the builtin is a class method, this will probably be below but relatively -near to the implementation. - -Note that the body of the macro contains a trailing comma; when you -replace the existing static :c:type:`!PyMethodDef` structure with the macro, -*don't* add a comma to the end. - -Sample:: - - static struct PyMethodDef Pickler_methods[] = { - __PICKLE_PICKLER_DUMP_METHODDEF - __PICKLE_PICKLER_CLEAR_MEMO_METHODDEF - {NULL, NULL} /* sentinel */ - }; - -Finally, compile, then run the relevant portions of the regression-test suite. 
-This change should not introduce any new compile-time warnings or errors, -and there should be no externally visible change to Python's behavior, -except for one difference: :py:func:`inspect.signature` run on your function -should now provide a valid signature! - -Congratulations, you've ported your first function to work with Argument Clinic! - - -.. _clinic-howtos: - -How-to guides -============= - - -How to rename C functions and variables generated by Argument Clinic --------------------------------------------------------------------- - -Argument Clinic automatically names the functions it generates for you. -Occasionally this may cause a problem, if the generated name collides with -the name of an existing C function. There's an easy solution: override the names -used for the C functions. Just add the keyword ``"as"`` -to your function declaration line, followed by the function name you wish to use. -Argument Clinic will use that function name for the base (generated) function, -then add ``"_impl"`` to the end and use that for the name of the impl function. - -For example, if we wanted to rename the C function names generated for -:py:meth:`pickle.Pickler.dump`, it'd look like this:: - - /*[clinic input] - pickle.Pickler.dump as pickler_dumper - - ... - -The base function would now be named :c:func:`!pickler_dumper`, -and the impl function would now be named :c:func:`!pickler_dumper_impl`. - - -Similarly, you may have a problem where you want to give a parameter -a specific Python name, but that name may be inconvenient in C. Argument -Clinic allows you to give a parameter different names in Python and in C, -using the same ``"as"`` syntax:: - - /*[clinic input] - pickle.Pickler.dump - - obj: object - file as file_obj: object - protocol: object = NULL - * - fix_imports: bool = True - -Here, the name used in Python (in the signature and the ``keywords`` -array) would be *file*, but the C variable would be named ``file_obj``. - -You can use this to rename the *self* parameter too! - - -How to convert functions using ``PyArg_UnpackTuple`` ----------------------------------------------------- - -To convert a function parsing its arguments with :c:func:`PyArg_UnpackTuple`, -simply write out all the arguments, specifying each as an ``object``. You -may specify the *type* argument to cast the type as appropriate. All -arguments should be marked positional-only (add a ``/`` on a line by itself -after the last argument). - -Currently the generated code will use :c:func:`PyArg_ParseTuple`, but this -will change soon. - - -How to use optional groups --------------------------- - -Some legacy functions have a tricky approach to parsing their arguments: -they count the number of positional arguments, then use a ``switch`` statement -to call one of several different :c:func:`PyArg_ParseTuple` calls depending on -how many positional arguments there are. (These functions cannot accept -keyword-only arguments.) This approach was used to simulate optional -arguments back before :c:func:`PyArg_ParseTupleAndKeywords` was created. - -While functions using this approach can often be converted to -use :c:func:`!PyArg_ParseTupleAndKeywords`, optional arguments, and default values, -it's not always possible. Some of these legacy functions have -behaviors :c:func:`!PyArg_ParseTupleAndKeywords` doesn't directly support. -The most obvious example is the builtin function :py:func:`range`, which has -an optional argument on the *left* side of its required argument! 
-Another example is :py:meth:`curses.window.addch`, which has a group of two -arguments that must always be specified together. (The arguments are -called *x* and *y*; if you call the function passing in *x*, -you must also pass in *y* — and if you don't pass in *x* you may not -pass in *y* either.) - -In any case, the goal of Argument Clinic is to support argument parsing -for all existing CPython builtins without changing their semantics. -Therefore Argument Clinic supports -this alternate approach to parsing, using what are called *optional groups*. -Optional groups are groups of arguments that must all be passed in together. -They can be to the left or the right of the required arguments. They -can *only* be used with positional-only parameters. - -.. note:: Optional groups are *only* intended for use when converting - functions that make multiple calls to :c:func:`PyArg_ParseTuple`! - Functions that use *any* other approach for parsing arguments - should *almost never* be converted to Argument Clinic using - optional groups. Functions using optional groups currently - cannot have accurate signatures in Python, because Python just - doesn't understand the concept. Please avoid using optional - groups wherever possible. - -To specify an optional group, add a ``[`` on a line by itself before -the parameters you wish to group together, and a ``]`` on a line by itself -after these parameters. As an example, here's how :py:meth:`curses.window.addch` -uses optional groups to make the first two parameters and the last -parameter optional:: - - /*[clinic input] - - curses.window.addch - - [ - x: int - X-coordinate. - y: int - Y-coordinate. - ] - - ch: object - Character to add. - - [ - attr: long - Attributes for the character. - ] - / - - ... - - -Notes: - -* For every optional group, one additional parameter will be passed into the - impl function representing the group. The parameter will be an int named - ``group_{direction}_{number}``, - where ``{direction}`` is either ``right`` or ``left`` depending on whether the group - is before or after the required parameters, and ``{number}`` is a monotonically - increasing number (starting at 1) indicating how far away the group is from - the required parameters. When the impl is called, this parameter will be set - to zero if this group was unused, and set to non-zero if this group was used. - (By used or unused, I mean whether or not the parameters received arguments - in this invocation.) - -* If there are no required arguments, the optional groups will behave - as if they're to the right of the required arguments. - -* In the case of ambiguity, the argument parsing code - favors parameters on the left (before the required parameters). - -* Optional groups can only contain positional-only parameters. - -* Optional groups are *only* intended for legacy code. Please do not - use optional groups for new code. - - -How to use real Argument Clinic converters, instead of "legacy converters" --------------------------------------------------------------------------- - -To save time, and to minimize how much you need to learn -to achieve your first port to Argument Clinic, the walkthrough above tells -you to use "legacy converters". "Legacy converters" are a convenience, -designed explicitly to make porting existing code to Argument Clinic -easier. And to be clear, their use is acceptable when porting code for -Python 3.4. - -However, in the long term we probably want all our blocks to -use Argument Clinic's real syntax for converters. Why? 
A couple -reasons: - -* The proper converters are far easier to read and clearer in their intent. -* There are some format units that are unsupported as "legacy converters", - because they require arguments, and the legacy converter syntax doesn't - support specifying arguments. -* In the future we may have a new argument parsing library that isn't - restricted to what :c:func:`PyArg_ParseTuple` supports; this flexibility - won't be available to parameters using legacy converters. - -Therefore, if you don't mind a little extra effort, please use the normal -converters instead of legacy converters. - -In a nutshell, the syntax for Argument Clinic (non-legacy) converters -looks like a Python function call. However, if there are no explicit -arguments to the function (all functions take their default values), -you may omit the parentheses. Thus ``bool`` and ``bool()`` are exactly -the same converters. - -All arguments to Argument Clinic converters are keyword-only. -All Argument Clinic converters accept the following arguments: - - *c_default* - The default value for this parameter when defined in C. - Specifically, this will be the initializer for the variable declared - in the "parse function". See :ref:`the section on default values ` - for how to use this. - Specified as a string. - - *annotation* - The annotation value for this parameter. Not currently supported, - because :pep:`8` mandates that the Python library may not use - annotations. - -In addition, some converters accept additional arguments. Here is a list -of these arguments, along with their meanings: - - *accept* - A set of Python types (and possibly pseudo-types); - this restricts the allowable Python argument to values of these types. - (This is not a general-purpose facility; as a rule it only supports - specific lists of types as shown in the legacy converter table.) - - To accept ``None``, add ``NoneType`` to this set. - - *bitwise* - Only supported for unsigned integers. The native integer value of this - Python argument will be written to the parameter without any range checking, - even for negative values. - - *converter* - Only supported by the ``object`` converter. Specifies the name of a - :ref:`C "converter function" ` - to use to convert this object to a native type. - - *encoding* - Only supported for strings. Specifies the encoding to use when converting - this string from a Python str (Unicode) value into a C ``char *`` value. - - - *subclass_of* - Only supported for the ``object`` converter. Requires that the Python - value be a subclass of a Python type, as expressed in C. - - *type* - Only supported for the ``object`` and ``self`` converters. Specifies - the C type that will be used to declare the variable. Default value is - ``"PyObject *"``. - - *zeroes* - Only supported for strings. If true, embedded NUL bytes (``'\\0'``) are - permitted inside the value. The length of the string will be passed in - to the impl function, just after the string parameter, as a parameter named - ``_length``. - -Please note, not every possible combination of arguments will work. -Usually these arguments are implemented by specific :c:func:`PyArg_ParseTuple` -*format units*, with specific behavior. For example, currently you cannot -call ``unsigned_short`` without also specifying ``bitwise=True``. -Although it's perfectly reasonable to think this would work, these semantics don't -map to any existing format unit. So Argument Clinic doesn't support it. (Or, at -least, not yet.) 
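To make the argument list above concrete, here is a hypothetical block that
uses two of those keyword arguments.  The module, function, and parameter
names are invented for the example, and it assumes ``myext`` was declared
with a ``module myext`` block earlier in the file, as shown in the tutorial::

    /*[clinic input]
    myext.register

        callback: object(subclass_of='&PyFunction_Type')
        name: str(accept={str, NoneType}) = None
        /

    Register callback, optionally under an explicit name.
    [clinic start generated code]*/

Here *subclass_of* gives the behavior of the legacy ``O!`` unit, and
``accept={str, NoneType}`` that of the legacy ``z`` unit, as the table below
spells out.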
- -Below is a table showing the mapping of legacy converters into real -Argument Clinic converters. On the left is the legacy converter, -on the right is the text you'd replace it with. - -========= ================================================================================= -``'B'`` ``unsigned_char(bitwise=True)`` -``'b'`` ``unsigned_char`` -``'c'`` ``char`` -``'C'`` ``int(accept={str})`` -``'d'`` ``double`` -``'D'`` ``Py_complex`` -``'es'`` ``str(encoding='name_of_encoding')`` -``'es#'`` ``str(encoding='name_of_encoding', zeroes=True)`` -``'et'`` ``str(encoding='name_of_encoding', accept={bytes, bytearray, str})`` -``'et#'`` ``str(encoding='name_of_encoding', accept={bytes, bytearray, str}, zeroes=True)`` -``'f'`` ``float`` -``'h'`` ``short`` -``'H'`` ``unsigned_short(bitwise=True)`` -``'i'`` ``int`` -``'I'`` ``unsigned_int(bitwise=True)`` -``'k'`` ``unsigned_long(bitwise=True)`` -``'K'`` ``unsigned_long_long(bitwise=True)`` -``'l'`` ``long`` -``'L'`` ``long long`` -``'n'`` ``Py_ssize_t`` -``'O'`` ``object`` -``'O!'`` ``object(subclass_of='&PySomething_Type')`` -``'O&'`` ``object(converter='name_of_c_function')`` -``'p'`` ``bool`` -``'S'`` ``PyBytesObject`` -``'s'`` ``str`` -``'s#'`` ``str(zeroes=True)`` -``'s*'`` ``Py_buffer(accept={buffer, str})`` -``'U'`` ``unicode`` -``'u'`` ``Py_UNICODE`` -``'u#'`` ``Py_UNICODE(zeroes=True)`` -``'w*'`` ``Py_buffer(accept={rwbuffer})`` -``'Y'`` ``PyByteArrayObject`` -``'y'`` ``str(accept={bytes})`` -``'y#'`` ``str(accept={robuffer}, zeroes=True)`` -``'y*'`` ``Py_buffer`` -``'Z'`` ``Py_UNICODE(accept={str, NoneType})`` -``'Z#'`` ``Py_UNICODE(accept={str, NoneType}, zeroes=True)`` -``'z'`` ``str(accept={str, NoneType})`` -``'z#'`` ``str(accept={str, NoneType}, zeroes=True)`` -``'z*'`` ``Py_buffer(accept={buffer, str, NoneType})`` -========= ================================================================================= - -As an example, here's our sample ``pickle.Pickler.dump`` using the proper -converter:: - - /*[clinic input] - pickle.Pickler.dump - - obj: object - The object to be pickled. - / - - Write a pickled representation of obj to the open file. - [clinic start generated code]*/ - -One advantage of real converters is that they're more flexible than legacy -converters. For example, the ``unsigned_int`` converter (and all the -``unsigned_`` converters) can be specified without ``bitwise=True``. Their -default behavior performs range checking on the value, and they won't accept -negative numbers. You just can't do that with a legacy converter! - -Argument Clinic will show you all the converters it has -available. For each converter it'll show you all the parameters -it accepts, along with the default value for each parameter. -Just run ``Tools/clinic/clinic.py --converters`` to see the full list. - - -How to use the ``Py_buffer`` converter --------------------------------------- - -When using the ``Py_buffer`` converter -(or the ``'s*'``, ``'w*'``, ``'*y'``, or ``'z*'`` legacy converters), -you *must* not call :c:func:`PyBuffer_Release` on the provided buffer. -Argument Clinic generates code that does it for you (in the parsing function). - - -.. _clinic-howto-advanced-converters: - -How to use advanced converters ------------------------------- - -Remember those format units you skipped for your first -time because they were advanced? Here's how to handle those too. - -The trick is, all those format units take arguments—either -conversion functions, or types, or strings specifying an encoding. 
-(But "legacy converters" don't support arguments. That's why we -skipped them for your first function.) The argument you specified -to the format unit is now an argument to the converter; this -argument is either *converter* (for ``O&``), *subclass_of* (for ``O!``), -or *encoding* (for all the format units that start with ``e``). - -When using *subclass_of*, you may also want to use the other -custom argument for ``object()``: *type*, which lets you set the type -actually used for the parameter. For example, if you want to ensure -that the object is a subclass of :c:var:`PyUnicode_Type`, you probably want -to use the converter ``object(type='PyUnicodeObject *', subclass_of='&PyUnicode_Type')``. - -One possible problem with using Argument Clinic: it takes away some possible -flexibility for the format units starting with ``e``. When writing a -:c:func:`!PyArg_Parse*` call by hand, you could theoretically decide at runtime what -encoding string to pass to that call. But now this string must -be hard-coded at Argument-Clinic-preprocessing-time. This limitation is deliberate; -it made supporting this format unit much easier, and may allow for future optimizations. -This restriction doesn't seem unreasonable; CPython itself always passes in static -hard-coded encoding strings for parameters whose format units start with ``e``. - - -.. _clinic-howto-default-values: -.. _default_values: - -How to assign default values to parameter ------------------------------------------ - -Default values for parameters can be any of a number of values. -At their simplest, they can be string, int, or float literals: - -.. code-block:: none - - foo: str = "abc" - bar: int = 123 - bat: float = 45.6 - -They can also use any of Python's built-in constants: - -.. code-block:: none - - yep: bool = True - nope: bool = False - nada: object = None - -There's also special support for a default value of ``NULL``, and -for simple expressions, documented in the following sections. - - -The ``NULL`` default value -^^^^^^^^^^^^^^^^^^^^^^^^^^ - -For string and object parameters, you can set them to ``None`` to indicate -that there's no default. However, that means the C variable will be -initialized to ``Py_None``. For convenience's sakes, there's a special -value called ``NULL`` for just this reason: from Python's perspective it -behaves like a default value of ``None``, but the C variable is initialized -with ``NULL``. - - -Symbolic default values -^^^^^^^^^^^^^^^^^^^^^^^ - -The default value you provide for a parameter can't be any arbitrary -expression. Currently the following are explicitly supported: - -* Numeric constants (integer and float) -* String constants -* ``True``, ``False``, and ``None`` -* Simple symbolic constants like :py:data:`sys.maxsize`, which must - start with the name of the module - -(In the future, this may need to get even more elaborate, -to allow full expressions like ``CONSTANT - 1``.) - - -Expressions as default values -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -The default value for a parameter can be more than just a literal value. -It can be an entire expression, using math operators and looking up attributes -on objects. However, this support isn't exactly simple, because of some -non-obvious semantics. - -Consider the following example: - -.. code-block:: none - - foo: Py_ssize_t = sys.maxsize - 1 - -:py:data:`sys.maxsize` can have different values on different platforms. Therefore -Argument Clinic can't simply evaluate that expression locally and hard-code it -in C. 
So it stores the default in such a way that it will get evaluated at -runtime, when the user asks for the function's signature. - -What namespace is available when the expression is evaluated? It's evaluated -in the context of the module the builtin came from. So, if your module has an -attribute called :py:attr:`!max_widgets`, you may simply use it: - -.. code-block:: none - - foo: Py_ssize_t = max_widgets - -If the symbol isn't found in the current module, it fails over to looking in -:py:data:`sys.modules`. That's how it can find :py:data:`sys.maxsize` for example. -(Since you don't know in advance what modules the user will load into their interpreter, -it's best to restrict yourself to modules that are preloaded by Python itself.) - -Evaluating default values only at runtime means Argument Clinic can't compute -the correct equivalent C default value. So you need to tell it explicitly. -When you use an expression, you must also specify the equivalent expression -in C, using the *c_default* parameter to the converter: - -.. code-block:: none - - foo: Py_ssize_t(c_default="PY_SSIZE_T_MAX - 1") = sys.maxsize - 1 - -Another complication: Argument Clinic can't know in advance whether or not the -expression you supply is valid. It parses it to make sure it looks legal, but -it can't *actually* know. You must be very careful when using expressions to -specify values that are guaranteed to be valid at runtime! - -Finally, because expressions must be representable as static C values, there -are many restrictions on legal expressions. Here's a list of Python features -you're not permitted to use: - -* Function calls. -* Inline if statements (``3 if foo else 5``). -* Automatic sequence unpacking (``*[1, 2, 3]``). -* List/set/dict comprehensions and generator expressions. -* Tuple/list/set/dict literals. - - -.. _clinic-howto-return-converters: - -How to use return converters ----------------------------- - -By default, the impl function Argument Clinic generates for you returns -:c:type:`PyObject * `. -But your C function often computes some C type, -then converts it into the :c:type:`!PyObject *` -at the last moment. Argument Clinic handles converting your inputs from Python types -into native C types—why not have it convert your return value from a native C type -into a Python type too? - -That's what a "return converter" does. It changes your impl function to return -some C type, then adds code to the generated (non-impl) function to handle converting -that value into the appropriate :c:type:`!PyObject *`. - -The syntax for return converters is similar to that of parameter converters. -You specify the return converter like it was a return annotation on the -function itself, using ``->`` notation. - -For example: - -.. code-block:: c - - /*[clinic input] - add -> int - - a: int - b: int - / - - [clinic start generated code]*/ - -Return converters behave much the same as parameter converters; -they take arguments, the arguments are all keyword-only, and if you're not changing -any of the default arguments you can omit the parentheses. - -(If you use both ``"as"`` *and* a return converter for your function, -the ``"as"`` should come before the return converter.) - -There's one additional complication when using return converters: how do you -indicate an error has occurred? Normally, a function returns a valid (non-``NULL``) -pointer for success, and ``NULL`` for failure. But if you use an integer return converter, -all integers are valid. How can Argument Clinic detect an error? 
Its solution: each return -converter implicitly looks for a special value that indicates an error. If you return -that value, and an error has been set (c:func:`PyErr_Occurred` returns a true -value), then the generated code will propagate the error. Otherwise it will -encode the value you return like normal. - -Currently Argument Clinic supports only a few return converters: - -.. code-block:: none - - bool - double - float - int - long - Py_ssize_t - size_t - unsigned int - unsigned long - -None of these take parameters. -For all of these, return ``-1`` to indicate error. - -(There's also an experimental ``NoneType`` converter, which lets you -return ``Py_None`` on success or ``NULL`` on failure, without having -to increment the reference count on ``Py_None``. I'm not sure it adds -enough clarity to be worth using.) - -To see all the return converters Argument Clinic supports, along with -their parameters (if any), -just run ``Tools/clinic/clinic.py --converters`` for the full list. - - -How to clone existing functions -------------------------------- - -If you have a number of functions that look similar, you may be able to -use Clinic's "clone" feature. When you clone an existing function, -you reuse: - -* its parameters, including - - * their names, - - * their converters, with all parameters, - - * their default values, - - * their per-parameter docstrings, - - * their *kind* (whether they're positional only, - positional or keyword, or keyword only), and - -* its return converter. - -The only thing not copied from the original function is its docstring; -the syntax allows you to specify a new docstring. - -Here's the syntax for cloning a function:: - - /*[clinic input] - module.class.new_function [as c_basename] = module.class.existing_function - - Docstring for new_function goes here. - [clinic start generated code]*/ - -(The functions can be in different modules or classes. I wrote -``module.class`` in the sample just to illustrate that you must -use the full path to *both* functions.) - -Sorry, there's no syntax for partially cloning a function, or cloning a function -then modifying it. Cloning is an all-or nothing proposition. - -Also, the function you are cloning from must have been previously defined -in the current file. - - -How to call Python code ------------------------ - -The rest of the advanced topics require you to write Python code -which lives inside your C file and modifies Argument Clinic's -runtime state. This is simple: you simply define a Python block. - -A Python block uses different delimiter lines than an Argument -Clinic function block. It looks like this:: - - /*[python input] - # python code goes here - [python start generated code]*/ - -All the code inside the Python block is executed at the -time it's parsed. All text written to stdout inside the block -is redirected into the "output" after the block. - -As an example, here's a Python block that adds a static integer -variable to the C code:: - - /*[python input] - print('static int __ignored_unused_variable__ = 0;') - [python start generated code]*/ - static int __ignored_unused_variable__ = 0; - /*[python checksum:...]*/ - - -.. _clinic-howto-self-converter: - -How to use the "self converter" -------------------------------- - -Argument Clinic automatically adds a "self" parameter for you -using a default converter. It automatically sets the ``type`` -of this parameter to the "pointer to an instance" you specified -when you declared the type. 
However, you can override -Argument Clinic's converter and specify one yourself. -Just add your own *self* parameter as the first parameter in a -block, and ensure that its converter is an instance of -:class:`!self_converter` or a subclass thereof. - -What's the point? This lets you override the type of ``self``, -or give it a different default name. - -How do you specify the custom type you want to cast ``self`` to? -If you only have one or two functions with the same type for ``self``, -you can directly use Argument Clinic's existing ``self`` converter, -passing in the type you want to use as the *type* parameter:: - - /*[clinic input] - - _pickle.Pickler.dump - - self: self(type="PicklerObject *") - obj: object - / - - Write a pickled representation of the given object to the open file. - [clinic start generated code]*/ - -On the other hand, if you have a lot of functions that will use the same -type for ``self``, it's best to create your own converter, subclassing -:class:`!self_converter` but overwriting the :py:attr:`!type` member:: - - /*[python input] - class PicklerObject_converter(self_converter): - type = "PicklerObject *" - [python start generated code]*/ - - /*[clinic input] - - _pickle.Pickler.dump - - self: PicklerObject - obj: object - / - - Write a pickled representation of the given object to the open file. - [clinic start generated code]*/ - - -How to use the "defining class" converter ------------------------------------------ - -Argument Clinic facilitates gaining access to the defining class of a method. -This is useful for :ref:`heap type ` methods that need to fetch -module level state. Use :c:func:`PyType_FromModuleAndSpec` to associate a new -heap type with a module. You can now use :c:func:`PyType_GetModuleState` on -the defining class to fetch the module state, for example from a module method. - -Example from :source:`Modules/zlibmodule.c`. -First, ``defining_class`` is added to the clinic input:: - - /*[clinic input] - zlib.Compress.compress - - cls: defining_class - data: Py_buffer - Binary data to be compressed. - / - - -After running the Argument Clinic tool, the following function signature is -generated:: - - /*[clinic start generated code]*/ - static PyObject * - zlib_Compress_compress_impl(compobject *self, PyTypeObject *cls, - Py_buffer *data) - /*[clinic end generated code: output=6731b3f0ff357ca6 input=04d00f65ab01d260]*/ - - -The following code can now use ``PyType_GetModuleState(cls)`` to fetch the -module state:: - - zlibstate *state = PyType_GetModuleState(cls); - - -Each method may only have one argument using this converter, and it must appear -after ``self``, or, if ``self`` is not used, as the first argument. The argument -will be of type ``PyTypeObject *``. The argument will not appear in the -:py:attr:`!__text_signature__`. - -The ``defining_class`` converter is not compatible with :py:meth:`!__init__` -and :py:meth:`!__new__` methods, which cannot use the :c:macro:`METH_METHOD` -convention. - -It is not possible to use ``defining_class`` with slot methods. In order to -fetch the module state from such methods, use :c:func:`PyType_GetModuleByDef` -to look up the module and then :c:func:`PyModule_GetState` to fetch the module -state. Example from the ``setattro`` slot method in -:source:`Modules/_threadmodule.c`:: - - static int - local_setattro(localobject *self, PyObject *name, PyObject *v) - { - PyObject *module = PyType_GetModuleByDef(Py_TYPE(self), &thread_module); - thread_module_state *state = get_thread_state(module); - ... 
- } - - -See also :pep:`573`. - - -.. _clinic-howto-custom-converter: - -How to write a custom converter -------------------------------- - -A converter is a Python class that inherits from :py:class:`CConverter`. -The main purpose of a custom converter, is for parameters parsed with -the ``O&`` format unit --- parsing such a parameter means calling -a :c:func:`PyArg_ParseTuple` "converter function". - -Your converter class should be named :samp:`{ConverterName}_converter`. -By following this convention, your converter class will be automatically -registered with Argument Clinic, with its *converter name* being the name of -your converter class with the ``_converter`` suffix stripped off. - -Instead of subclassing :py:meth:`!CConverter.__init__`, -write a :py:meth:`!converter_init` method. -:py:meth:`!converter_init` always accepts a *self* parameter. -After *self*, all additional parameters **must** be keyword-only. -Any arguments passed to the converter in Argument Clinic -will be passed along to your :py:meth:`!converter_init` method. -See :py:class:`CConverter` for a list of members you may wish to specify in -your subclass. - -Here's the simplest example of a custom converter, from :source:`Modules/zlibmodule.c`:: - - /*[python input] - - class ssize_t_converter(CConverter): - type = 'Py_ssize_t' - converter = 'ssize_t_converter' - - [python start generated code]*/ - /*[python end generated code: output=da39a3ee5e6b4b0d input=35521e4e733823c7]*/ - -This block adds a converter named ``ssize_t`` to Argument Clinic. -Parameters declared as ``ssize_t`` will be declared with type :c:type:`Py_ssize_t`, -and will be parsed by the ``'O&'`` format unit, -which will call the :c:func:`!ssize_t_converter` converter C function. -``ssize_t`` variables automatically support default values. - -More sophisticated custom converters can insert custom C code to -handle initialization and cleanup. -You can see more examples of custom converters in the CPython -source tree; grep the C files for the string ``CConverter``. - - -How to write a custom return converter --------------------------------------- - -Writing a custom return converter is much like writing -a custom converter. Except it's somewhat simpler, because return -converters are themselves much simpler. - -Return converters must subclass :py:class:`!CReturnConverter`. -There are no examples yet of custom return converters, -because they are not widely used yet. If you wish to -write your own return converter, please read :source:`Tools/clinic/clinic.py`, -specifically the implementation of :py:class:`!CReturnConverter` and -all its subclasses. - - -How to convert ``METH_O`` and ``METH_NOARGS`` functions -------------------------------------------------------- - -To convert a function using :c:macro:`METH_O`, make sure the function's -single argument is using the ``object`` converter, and mark the -arguments as positional-only:: - - /*[clinic input] - meth_o_sample - - argument: object - / - [clinic start generated code]*/ - - -To convert a function using :c:macro:`METH_NOARGS`, just don't specify -any arguments. - -You can still use a self converter, a return converter, and specify -a *type* argument to the object converter for :c:macro:`METH_O`. - - -How to convert ``tp_new`` and ``tp_init`` functions ---------------------------------------------------- - -You can convert :c:member:`~PyTypeObject.tp_new` and -:c:member:`~PyTypeObject.tp_init` functions. -Just name them ``__new__`` or ``__init__`` as appropriate. 
Notes: - -* The function name generated for ``__new__`` doesn't end in ``__new__`` - like it would by default. It's just the name of the class, converted - into a valid C identifier. - -* No :c:type:`PyMethodDef` ``#define`` is generated for these functions. - -* ``__init__`` functions return ``int``, not ``PyObject *``. - -* Use the docstring as the class docstring. - -* Although ``__new__`` and ``__init__`` functions must always - accept both the ``args`` and ``kwargs`` objects, when converting - you may specify any signature for these functions that you like. - (If your function doesn't support keywords, the parsing function - generated will throw an exception if it receives any.) - - -How to change and redirect Clinic's output ------------------------------------------- - -It can be inconvenient to have Clinic's output interspersed with -your conventional hand-edited C code. Luckily, Clinic is configurable: -you can buffer up its output for printing later (or earlier!), or write -its output to a separate file. You can also add a prefix or suffix to -every line of Clinic's generated output. - -While changing Clinic's output in this manner can be a boon to readability, -it may result in Clinic code using types before they are defined, or -your code attempting to use Clinic-generated code before it is defined. -These problems can be easily solved by rearranging the declarations in your file, -or moving where Clinic's generated code goes. (This is why the default behavior -of Clinic is to output everything into the current block; while many people -consider this hampers readability, it will never require rearranging your -code to fix definition-before-use problems.) - -Let's start with defining some terminology: - -*field* - A field, in this context, is a subsection of Clinic's output. - For example, the ``#define`` for the :c:type:`PyMethodDef` structure - is a field, called ``methoddef_define``. Clinic has seven - different fields it can output per function definition: - - .. code-block:: none - - docstring_prototype - docstring_definition - methoddef_define - impl_prototype - parser_prototype - parser_definition - impl_definition - - All the names are of the form ``"_"``, - where ``""`` is the semantic object represented (the parsing function, - the impl function, the docstring, or the methoddef structure) and ``""`` - represents what kind of statement the field is. Field names that end in - ``"_prototype"`` - represent forward declarations of that thing, without the actual body/data - of the thing; field names that end in ``"_definition"`` represent the actual - definition of the thing, with the body/data of the thing. (``"methoddef"`` - is special, it's the only one that ends with ``"_define"``, representing that - it's a preprocessor #define.) - -*destination* - A destination is a place Clinic can write output to. There are - five built-in destinations: - - ``block`` - The default destination: printed in the output section of - the current Clinic block. - - ``buffer`` - A text buffer where you can save text for later. Text sent - here is appended to the end of any existing text. It's an - error to have any text left in the buffer when Clinic finishes - processing a file. - - ``file`` - A separate "clinic file" that will be created automatically by Clinic. - The filename chosen for the file is ``{basename}.clinic{extension}``, - where ``basename`` and ``extension`` were assigned the output - from ``os.path.splitext()`` run on the current file. 
(Example: - the ``file`` destination for :file:`_pickle.c` would be written to - :file:`_pickle.clinic.c`.) - - **Important: When using a** ``file`` **destination, you** - *must check in* **the generated file!** - - ``two-pass`` - A buffer like ``buffer``. However, a two-pass buffer can only - be dumped once, and it prints out all text sent to it during - all processing, even from Clinic blocks *after* the dumping point. - - ``suppress`` - The text is suppressed—thrown away. - - -Clinic defines five new directives that let you reconfigure its output. - -The first new directive is ``dump``: - -.. code-block:: none - - dump - -This dumps the current contents of the named destination into the output of -the current block, and empties it. This only works with ``buffer`` and -``two-pass`` destinations. - -The second new directive is ``output``. The most basic form of ``output`` -is like this: - -.. code-block:: none - - output - -This tells Clinic to output *field* to *destination*. ``output`` also -supports a special meta-destination, called ``everything``, which tells -Clinic to output *all* fields to that *destination*. - -``output`` has a number of other functions: - -.. code-block:: none - - output push - output pop - output preset - - -``output push`` and ``output pop`` allow you to push and pop -configurations on an internal configuration stack, so that you -can temporarily modify the output configuration, then easily restore -the previous configuration. Simply push before your change to save -the current configuration, then pop when you wish to restore the -previous configuration. - -``output preset`` sets Clinic's output to one of several built-in -preset configurations, as follows: - - ``block`` - Clinic's original starting configuration. Writes everything - immediately after the input block. - - Suppress the ``parser_prototype`` - and ``docstring_prototype``, write everything else to ``block``. - - ``file`` - Designed to write everything to the "clinic file" that it can. - You then ``#include`` this file near the top of your file. - You may need to rearrange your file to make this work, though - usually this just means creating forward declarations for various - ``typedef`` and ``PyTypeObject`` definitions. - - Suppress the ``parser_prototype`` - and ``docstring_prototype``, write the ``impl_definition`` to - ``block``, and write everything else to ``file``. - - The default filename is ``"{dirname}/clinic/{basename}.h"``. - - ``buffer`` - Save up most of the output from Clinic, to be written into - your file near the end. For Python files implementing modules - or builtin types, it's recommended that you dump the buffer - just above the static structures for your module or - builtin type; these are normally very near the end. Using - ``buffer`` may require even more editing than ``file``, if - your file has static ``PyMethodDef`` arrays defined in the - middle of the file. - - Suppress the ``parser_prototype``, ``impl_prototype``, - and ``docstring_prototype``, write the ``impl_definition`` to - ``block``, and write everything else to ``file``. - - ``two-pass`` - Similar to the ``buffer`` preset, but writes forward declarations to - the ``two-pass`` buffer, and definitions to the ``buffer``. - This is similar to the ``buffer`` preset, but may require - less editing than ``buffer``. Dump the ``two-pass`` buffer - near the top of your file, and dump the ``buffer`` near - the end just like you would when using the ``buffer`` preset. 
- - Suppresses the ``impl_prototype``, write the ``impl_definition`` - to ``block``, write ``docstring_prototype``, ``methoddef_define``, - and ``parser_prototype`` to ``two-pass``, write everything else - to ``buffer``. - - ``partial-buffer`` - Similar to the ``buffer`` preset, but writes more things to ``block``, - only writing the really big chunks of generated code to ``buffer``. - This avoids the definition-before-use problem of ``buffer`` completely, - at the small cost of having slightly more stuff in the block's output. - Dump the ``buffer`` near the end, just like you would when using - the ``buffer`` preset. - - Suppresses the ``impl_prototype``, write the ``docstring_definition`` - and ``parser_definition`` to ``buffer``, write everything else to ``block``. - -The third new directive is ``destination``: - -.. code-block:: none - - destination [...] - -This performs an operation on the destination named ``name``. - -There are two defined subcommands: ``new`` and ``clear``. - -The ``new`` subcommand works like this: - -.. code-block:: none - - destination new - -This creates a new destination with name ```` and type ````. - -There are five destination types: - - ``suppress`` - Throws the text away. - - ``block`` - Writes the text to the current block. This is what Clinic - originally did. - - ``buffer`` - A simple text buffer, like the "buffer" builtin destination above. - - ``file`` - A text file. The file destination takes an extra argument, - a template to use for building the filename, like so: - - destination new - - The template can use three strings internally that will be replaced - by bits of the filename: - - {path} - The full path to the file, including directory and full filename. - {dirname} - The name of the directory the file is in. - {basename} - Just the name of the file, not including the directory. - {basename_root} - Basename with the extension clipped off - (everything up to but not including the last '.'). - {basename_extension} - The last '.' and everything after it. If the basename - does not contain a period, this will be the empty string. - - If there are no periods in the filename, {basename} and {filename} - are the same, and {extension} is empty. "{basename}{extension}" - is always exactly the same as "{filename}"." - - ``two-pass`` - A two-pass buffer, like the "two-pass" builtin destination above. - - -The ``clear`` subcommand works like this: - -.. code-block:: none - - destination clear - -It removes all the accumulated text up to this point in the destination. -(I don't know what you'd need this for, but I thought maybe it'd be -useful while someone's experimenting.) - -The fourth new directive is ``set``: - -.. code-block:: none - - set line_prefix "string" - set line_suffix "string" - -``set`` lets you set two internal variables in Clinic. -``line_prefix`` is a string that will be prepended to every line of Clinic's output; -``line_suffix`` is a string that will be appended to every line of Clinic's output. - -Both of these support two format strings: - - ``{block comment start}`` - Turns into the string ``/*``, the start-comment text sequence for C files. - - ``{block comment end}`` - Turns into the string ``*/``, the end-comment text sequence for C files. - -The final new directive is one you shouldn't need to use directly, -called ``preserve``: - -.. code-block:: none - - preserve - -This tells Clinic that the current contents of the output should be kept, unmodified. 
-This is used internally by Clinic when dumping output into ``file`` files; wrapping -it in a Clinic block lets Clinic use its existing checksum functionality to ensure -the file was not modified by hand before it gets overwritten. - - -How to use the ``#ifdef`` trick -------------------------------- - -If you're converting a function that isn't available on all platforms, -there's a trick you can use to make life a little easier. The existing -code probably looks like this:: - - #ifdef HAVE_FUNCTIONNAME - static module_functionname(...) - { - ... - } - #endif /* HAVE_FUNCTIONNAME */ - -And then in the ``PyMethodDef`` structure at the bottom the existing code -will have: - -.. code-block:: none - - #ifdef HAVE_FUNCTIONNAME - {'functionname', ... }, - #endif /* HAVE_FUNCTIONNAME */ - -In this scenario, you should enclose the body of your impl function inside the ``#ifdef``, -like so:: - - #ifdef HAVE_FUNCTIONNAME - /*[clinic input] - module.functionname - ... - [clinic start generated code]*/ - static module_functionname(...) - { - ... - } - #endif /* HAVE_FUNCTIONNAME */ - -Then, remove those three lines from the :c:type:`PyMethodDef` structure, -replacing them with the macro Argument Clinic generated: - -.. code-block:: none - - MODULE_FUNCTIONNAME_METHODDEF - -(You can find the real name for this macro inside the generated code. -Or you can calculate it yourself: it's the name of your function as defined -on the first line of your block, but with periods changed to underscores, -uppercased, and ``"_METHODDEF"`` added to the end.) - -Perhaps you're wondering: what if ``HAVE_FUNCTIONNAME`` isn't defined? -The ``MODULE_FUNCTIONNAME_METHODDEF`` macro won't be defined either! - -Here's where Argument Clinic gets very clever. It actually detects that the -Argument Clinic block might be deactivated by the ``#ifdef``. When that -happens, it generates a little extra code that looks like this:: - - #ifndef MODULE_FUNCTIONNAME_METHODDEF - #define MODULE_FUNCTIONNAME_METHODDEF - #endif /* !defined(MODULE_FUNCTIONNAME_METHODDEF) */ - -That means the macro always works. If the function is defined, this turns -into the correct structure, including the trailing comma. If the function is -undefined, this turns into nothing. - -However, this causes one ticklish problem: where should Argument Clinic put this -extra code when using the "block" output preset? It can't go in the output block, -because that could be deactivated by the ``#ifdef``. (That's the whole point!) - -In this situation, Argument Clinic writes the extra code to the "buffer" destination. -This may mean that you get a complaint from Argument Clinic: - -.. code-block:: none - - Warning in file "Modules/posixmodule.c" on line 12357: - Destination buffer 'buffer' not empty at end of file, emptying. - -When this happens, just open your file, find the ``dump buffer`` block that -Argument Clinic added to your file (it'll be at the very bottom), then -move it above the :c:type:`PyMethodDef` structure where that macro is used. - - -How to use Argument Clinic in Python files ------------------------------------------- - -It's actually possible to use Argument Clinic to preprocess Python files. -There's no point to using Argument Clinic blocks, of course, as the output -wouldn't make any sense to the Python interpreter. But using Argument Clinic -to run Python blocks lets you use Python as a Python preprocessor! - -Since Python comments are different from C comments, Argument Clinic -blocks embedded in Python files look slightly different. 
They look like this: - -.. code-block:: python3 - - #/*[python input] - #print("def foo(): pass") - #[python start generated code]*/ - def foo(): pass - #/*[python checksum:...]*/ - - -.. _clinic-howto-override-signature: - -How to override the generated signature ---------------------------------------- - -You can use the ``@text_signature`` directive to override the default generated -signature in the docstring. -This can be useful for complex signatures that Argument Clinic cannot handle. -The ``@text_signature`` directive takes one argument: -the custom signature as a string. -The provided signature is copied verbatim to the generated docstring. - -Example from :source:`Objects/codeobject.c`:: - - /*[clinic input] - @text_signature "($self, /, **changes)" - code.replace - * - co_argcount: int(c_default="self->co_argcount") = unchanged - co_posonlyargcount: int(c_default="self->co_posonlyargcount") = unchanged - # etc ... - - Return a copy of the code object with new values for the specified fields. - [clinic start generated output]*/ - -The generated docstring ends up looking like this: - -.. code-block:: none - - replace($self, /, **changes) - -- - - Return a copy of the code object with new values for the specified fields. + The Argument Clinic How-TO has been moved to the `Python Developer's Guide + `__. diff --git a/Doc/howto/index.rst b/Doc/howto/index.rst index 8a378e6659efc4..c53c613bc5b1c6 100644 --- a/Doc/howto/index.rst +++ b/Doc/howto/index.rst @@ -28,7 +28,6 @@ Currently, the HOWTOs are: urllib2.rst argparse.rst ipaddress.rst - clinic.rst instrumentation.rst annotations.rst isolating-extensions.rst diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index b73bf14bcbf192..3789164cc1f72e 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -123,7 +123,7 @@ There is a new function parameter syntax ``/`` to indicate that some function parameters must be specified positionally and cannot be used as keyword arguments. This is the same notation shown by ``help()`` for C functions annotated with Larry Hastings' -:ref:`Argument Clinic ` tool. +`Argument Clinic `__ tool. In the following example, parameters *a* and *b* are positional-only, while *c* or *d* can be positional or keyword, and *e* or *f* are diff --git a/Misc/NEWS.d/3.11.5.rst b/Misc/NEWS.d/3.11.5.rst index 502e0ccfd68c46..a26e3937418b95 100644 --- a/Misc/NEWS.d/3.11.5.rst +++ b/Misc/NEWS.d/3.11.5.rst @@ -793,7 +793,7 @@ and 3.1.2. Argument Clinic now supports overriding automatically generated signature by using directive ``@text_signature``. See -:ref:`clinic-howto-override-signature`. +`the Argument Clinic docs `__. .. 
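For readers following the ``#ifdef`` recipe and the ``dump buffer`` step described in the removed HOWTO text above, here is a minimal sketch of how the pieces fit together in a single file. It assumes Clinic's default block output; the names ``spam``, ``spam.zap`` and ``HAVE_ZAP`` are invented for illustration (they are not taken from CPython), and the checksums and the generated docstring/parser code that Clinic would insert are elided as comments:

    /* spam.c -- hypothetical sketch only, not a real CPython module. */
    #include "Python.h"

    /*[clinic input]
    module spam
    [clinic start generated code]*/
    /*[clinic end generated code: output=... input=...]*/

    #ifdef HAVE_ZAP
    /*[clinic input]
    spam.zap

        fd: int
        /

    Zap the given file descriptor.
    [clinic start generated code]*/
    /* ... generated docstring, SPAM_ZAP_METHODDEF and parser wrapper ... */

    static PyObject *
    spam_zap_impl(PyObject *module, int fd)
    /*[clinic end generated code: output=... input=...]*/
    {
        /* Hand-written implementation body. */
        return PyLong_FromLong(fd);
    }
    #endif /* HAVE_ZAP */

    /* Because the Clinic block above can be disabled by the #ifdef, Clinic
       writes a "#ifndef SPAM_ZAP_METHODDEF / #define SPAM_ZAP_METHODDEF"
       fallback to its "buffer" destination.  Move the "dump buffer" block
       that Clinic appends at the bottom of the file up here, above the
       method table, so the macro is always defined before it is used. */
    /*[clinic input]
    dump buffer
    [clinic start generated code]*/
    /* ... buffered output is dumped here ... */
    /*[clinic end generated code: output=... input=...]*/

    static PyMethodDef spam_methods[] = {
        SPAM_ZAP_METHODDEF
        {NULL, NULL}
    };

With this layout, ``SPAM_ZAP_METHODDEF`` expands to the full ``PyMethodDef`` entry (including the trailing comma) when ``HAVE_ZAP`` is defined, and to nothing when it is not, so the method table needs no hand-written ``#ifdef`` around it.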
From c9214b90f469aa85e307a6dc3b6052c961003807 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 12 Oct 2023 11:57:36 +0200 Subject: [PATCH 070/265] [3.11] gh-107450: Raise OverflowError when parser column offset overflows (GH-110754) (#110763) (cherry picked from commit fb7843ee895ac7f6eeb58f356b1a320eea081cfc) Co-authored-by: Lysandros Nikolaou --- Lib/test/test_exceptions.py | 4 ++++ Parser/pegen_errors.c | 6 ++++++ 2 files changed, 10 insertions(+) diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 6f34e29e42f7c7..7f6eaa34e11406 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -318,6 +318,10 @@ def baz(): check('(yield i) = 2', 1, 2) check('def f(*):\n pass', 1, 7) + def testMemoryErrorBigSource(self): + with self.assertRaisesRegex(OverflowError, "column offset overflow"): + exec(f"if True:\n {' ' * 2**31}print('hello world')") + @cpython_only def testSettingException(self): # test that setting an exception at the C level works even if the diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index 3d8cccb0a9749c..ea5c4e227ff763 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -224,6 +224,12 @@ _PyPegen_raise_error(Parser *p, PyObject *errtype, const char *errmsg, ...) col_offset = 0; } else { const char* start = p->tok->buf ? p->tok->line_start : p->tok->buf; + if (p->tok->cur - start > INT_MAX) { + PyErr_SetString(PyExc_OverflowError, + "Parser column offset overflow - source line is too big"); + p->error_indicator = 1; + return NULL; + } col_offset = Py_SAFE_DOWNCAST(p->tok->cur - start, intptr_t, int); } } else { From e16922f07010a620298d7fb8abfae5c578d5d337 Mon Sep 17 00:00:00 2001 From: Nikita Sobolev Date: Thu, 12 Oct 2023 15:52:03 +0300 Subject: [PATCH 071/265] [3.11] gh-109216: Fix possible memory leak in `BUILD_MAP` (#109323) * [3.11] gh-109216: Fix possible memory leak in `BUILD_MAP` * Add NEWS * Update Python/ceval.c Co-authored-by: Kumar Aditya --------- Co-authored-by: Kumar Aditya --- .../2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst | 2 ++ Python/ceval.c | 5 +++-- 2 files changed, 5 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst b/Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst new file mode 100644 index 00000000000000..f36310fc5f8064 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst @@ -0,0 +1,2 @@ +Fix possible memory leak in :opcode:`BUILD_MAP`. 
+ diff --git a/Python/ceval.c b/Python/ceval.c index 172bc4743bda90..bb6bb35030d212 100644 --- a/Python/ceval.c +++ b/Python/ceval.c @@ -3318,13 +3318,14 @@ _PyEval_EvalFrameDefault(PyThreadState *tstate, _PyInterpreterFrame *frame, int &PEEK(2*oparg), 2, &PEEK(2*oparg - 1), 2, oparg); - if (map == NULL) - goto error; while (oparg--) { Py_DECREF(POP()); Py_DECREF(POP()); } + if (map == NULL) { + goto error; + } PUSH(map); DISPATCH(); } From 26748ed4f61520c59af15547792d1e73144a4314 Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Thu, 12 Oct 2023 23:45:36 +0200 Subject: [PATCH 072/265] gh-110756: Sync regrtest with main branch (#110758) (#110781) Copy files from main to this branch: * Lib/test/libregrtest/*.py * Lib/test/__init__.py * Lib/test/__main__.py * Lib/test/autotest.py * Lib/test/pythoninfo.py * Lib/test/regrtest.py * Lib/test/test_regrtest.py Copy also changes from: * Lib/test/support/__init__.py * Lib/test/support/os_helper.py * Lib/test/support/testresult.py * Lib/test/support/threading_helper.py * Lib/test/test_support.py Do not modify scripts running tests such as Makefile.pre.in, .github/workflows/build.yml or Tools/scripts/run_tests.py: do not use --fast-ci and --slow-ci in this change. Changes: * SPLITTESTDIRS: don't include test_inspect. * Add utils.process_cpu_count() using len(os.sched_getaffinity(0)). * test_regrtest doesn't use @support.without_optimizer which doesn't exist in Python 3.11. * Add support.set_sanitizer_env_var(). * Update test_faulthandler to use support.set_sanitizer_env_var(). * @support.without_optimizer doesn't exist in 3.11. * Add support.Py_DEBUG. * regrtest.refleak: 3.11 doesn't have sys.getunicodeinternedsize. --- Lib/test/__main__.py | 4 +- Lib/test/autotest.py | 2 +- Lib/test/libregrtest/__init__.py | 2 - Lib/test/libregrtest/cmdline.py | 125 ++- Lib/test/libregrtest/findtests.py | 105 +++ Lib/test/libregrtest/logger.py | 86 ++ Lib/test/libregrtest/main.py | 1199 +++++++++++--------------- Lib/test/libregrtest/pgo.py | 8 +- Lib/test/libregrtest/refleak.py | 35 +- Lib/test/libregrtest/result.py | 190 ++++ Lib/test/libregrtest/results.py | 261 ++++++ Lib/test/libregrtest/run_workers.py | 607 +++++++++++++ Lib/test/libregrtest/runtest.py | 479 ---------- Lib/test/libregrtest/runtest_mp.py | 564 ------------ Lib/test/libregrtest/runtests.py | 162 ++++ Lib/test/libregrtest/save_env.py | 14 +- Lib/test/libregrtest/setup.py | 139 ++- Lib/test/libregrtest/single.py | 278 ++++++ Lib/test/libregrtest/utils.py | 409 ++++++++- Lib/test/libregrtest/worker.py | 116 +++ Lib/test/pythoninfo.py | 122 ++- Lib/test/regrtest.py | 2 +- Lib/test/support/__init__.py | 73 +- Lib/test/support/os_helper.py | 2 +- Lib/test/support/testresult.py | 3 + Lib/test/support/threading_helper.py | 18 +- Lib/test/test_faulthandler.py | 17 +- Lib/test/test_regrtest.py | 823 +++++++++++++----- Lib/test/test_support.py | 43 +- 29 files changed, 3712 insertions(+), 2176 deletions(-) create mode 100644 Lib/test/libregrtest/findtests.py create mode 100644 Lib/test/libregrtest/logger.py create mode 100644 Lib/test/libregrtest/result.py create mode 100644 Lib/test/libregrtest/results.py create mode 100644 Lib/test/libregrtest/run_workers.py delete mode 100644 Lib/test/libregrtest/runtest.py delete mode 100644 Lib/test/libregrtest/runtest_mp.py create mode 100644 Lib/test/libregrtest/runtests.py create mode 100644 Lib/test/libregrtest/single.py create mode 100644 Lib/test/libregrtest/worker.py diff --git a/Lib/test/__main__.py b/Lib/test/__main__.py index 19a6b2b8904526..82b50ad2c6e777 
100644 --- a/Lib/test/__main__.py +++ b/Lib/test/__main__.py @@ -1,2 +1,2 @@ -from test.libregrtest import main -main() +from test.libregrtest.main import main +main(_add_python_opts=True) diff --git a/Lib/test/autotest.py b/Lib/test/autotest.py index fa85cc153a133a..b5a1fab404c72d 100644 --- a/Lib/test/autotest.py +++ b/Lib/test/autotest.py @@ -1,5 +1,5 @@ # This should be equivalent to running regrtest.py from the cmdline. # It can be especially handy if you're in an interactive shell, e.g., # from test import autotest. -from test.libregrtest import main +from test.libregrtest.main import main main() diff --git a/Lib/test/libregrtest/__init__.py b/Lib/test/libregrtest/__init__.py index 5e8dba5dbde71a..e69de29bb2d1d6 100644 --- a/Lib/test/libregrtest/__init__.py +++ b/Lib/test/libregrtest/__init__.py @@ -1,2 +0,0 @@ -from test.libregrtest.cmdline import _parse_args, RESOURCE_NAMES, ALL_RESOURCES -from test.libregrtest.main import main diff --git a/Lib/test/libregrtest/cmdline.py b/Lib/test/libregrtest/cmdline.py index 40ae8fc46e9b01..dd4cd335bef7e3 100644 --- a/Lib/test/libregrtest/cmdline.py +++ b/Lib/test/libregrtest/cmdline.py @@ -1,8 +1,9 @@ import argparse -import os +import os.path import shlex import sys from test.support import os_helper +from .utils import ALL_RESOURCES, RESOURCE_NAMES USAGE = """\ @@ -27,8 +28,10 @@ Additional option details: -r randomizes test execution order. You can use --randseed=int to provide an -int seed value for the randomizer; this is useful for reproducing troublesome -test orders. +int seed value for the randomizer. The randseed value will be used +to set seeds for all random usages in tests +(including randomizing the tests order if -r is set). +By default we always set random seed, but do not randomize test order. -s On the first invocation of regrtest using -s, the first test file found or the first test file given on the command line is run, and the name of @@ -130,25 +133,17 @@ """ -ALL_RESOURCES = ('audio', 'curses', 'largefile', 'network', - 'decimal', 'cpu', 'subprocess', 'urlfetch', 'gui', 'walltime') - -# Other resources excluded from --use=all: -# -# - extralagefile (ex: test_zipfile64): really too slow to be enabled -# "by default" -# - tzdata: while needed to validate fully test_datetime, it makes -# test_datetime too slow (15-20 min on some buildbots) and so is disabled by -# default (see bpo-30822). 
-RESOURCE_NAMES = ALL_RESOURCES + ('extralargefile', 'tzdata') - - class Namespace(argparse.Namespace): def __init__(self, **kwargs) -> None: + self.ci = False self.testdir = None self.verbose = 0 self.quiet = False self.exclude = False + self.cleanup = False + self.wait = False + self.list_cases = False + self.list_tests = False self.single = False self.randomize = False self.fromfile = None @@ -157,8 +152,8 @@ def __init__(self, **kwargs) -> None: self.trace = False self.coverdir = 'coverage' self.runleaks = False - self.huntrleaks = False - self.verbose2 = False + self.huntrleaks: tuple[int, int, str] | None = None + self.rerun = False self.verbose3 = False self.print_slow = False self.random_seed = None @@ -170,6 +165,14 @@ def __init__(self, **kwargs) -> None: self.ignore_tests = None self.pgo = False self.pgo_extended = False + self.worker_json = None + self.start = None + self.timeout = None + self.memlimit = None + self.threshold = None + self.fail_rerun = False + self.tempdir = None + self._add_python_opts = True super().__init__(**kwargs) @@ -198,25 +201,35 @@ def _create_parser(): # We add help explicitly to control what argument group it renders under. group.add_argument('-h', '--help', action='help', help='show this help message and exit') - group.add_argument('--timeout', metavar='TIMEOUT', type=float, + group.add_argument('--fast-ci', action='store_true', + help='Fast Continuous Integration (CI) mode used by ' + 'GitHub Actions') + group.add_argument('--slow-ci', action='store_true', + help='Slow Continuous Integration (CI) mode used by ' + 'buildbot workers') + group.add_argument('--timeout', metavar='TIMEOUT', help='dump the traceback and exit if a test takes ' 'more than TIMEOUT seconds; disabled if TIMEOUT ' 'is negative or equals to zero') group.add_argument('--wait', action='store_true', help='wait for user input, e.g., allow a debugger ' 'to be attached') - group.add_argument('--worker-args', metavar='ARGS') group.add_argument('-S', '--start', metavar='START', help='the name of the test at which to start.' + more_details) group.add_argument('-p', '--python', metavar='PYTHON', help='Command to run Python test subprocesses with.') + group.add_argument('--randseed', metavar='SEED', + dest='random_seed', type=int, + help='pass a global random seed') group = parser.add_argument_group('Verbosity') group.add_argument('-v', '--verbose', action='count', help='run tests in verbose mode with output to stdout') - group.add_argument('-w', '--verbose2', action='store_true', + group.add_argument('-w', '--rerun', action='store_true', help='re-run failed tests in verbose mode') + group.add_argument('--verbose2', action='store_true', dest='rerun', + help='deprecated alias to --rerun') group.add_argument('-W', '--verbose3', action='store_true', help='display test output on failure') group.add_argument('-q', '--quiet', action='store_true', @@ -229,10 +242,6 @@ def _create_parser(): group = parser.add_argument_group('Selecting tests') group.add_argument('-r', '--randomize', action='store_true', help='randomize test execution order.' + more_details) - group.add_argument('--randseed', metavar='SEED', - dest='random_seed', type=int, - help='pass a random seed to reproduce a previous ' - 'random run') group.add_argument('-f', '--fromfile', metavar='FILE', help='read names of tests to run from a file.' 
+ more_details) @@ -311,6 +320,9 @@ def _create_parser(): group.add_argument('--fail-env-changed', action='store_true', help='if a test file alters the environment, mark ' 'the test as failed') + group.add_argument('--fail-rerun', action='store_true', + help='if a test failed and then passed when re-run, ' + 'mark the tests as failed') group.add_argument('--junit-xml', dest='xmlpath', metavar='FILENAME', help='writes JUnit-style XML results to the specified ' @@ -319,6 +331,9 @@ def _create_parser(): help='override the working directory for the test run') group.add_argument('--cleanup', action='store_true', help='remove old test_python_* directories') + group.add_argument('--dont-add-python-opts', dest='_add_python_opts', + action='store_false', + help="internal option, don't use it") return parser @@ -369,7 +384,50 @@ def _parse_args(args, **kwargs): for arg in ns.args: if arg.startswith('-'): parser.error("unrecognized arguments: %s" % arg) - sys.exit(1) + + if ns.timeout is not None: + # Support "--timeout=" (no value) so Makefile.pre.pre TESTTIMEOUT + # can be used by "make buildbottest" and "make test". + if ns.timeout != "": + try: + ns.timeout = float(ns.timeout) + except ValueError: + parser.error(f"invalid timeout value: {ns.timeout!r}") + else: + ns.timeout = None + + # Continuous Integration (CI): common options for fast/slow CI modes + if ns.slow_ci or ns.fast_ci: + # Similar to options: + # + # -j0 --randomize --fail-env-changed --fail-rerun --rerun + # --slowest --verbose3 + if ns.use_mp is None: + ns.use_mp = 0 + ns.randomize = True + ns.fail_env_changed = True + ns.fail_rerun = True + if ns.python is None: + ns.rerun = True + ns.print_slow = True + ns.verbose3 = True + else: + ns._add_python_opts = False + + # When both --slow-ci and --fast-ci options are present, + # --slow-ci has the priority + if ns.slow_ci: + # Similar to: -u "all" --timeout=1200 + if not ns.use: + ns.use = [['all']] + if ns.timeout is None: + ns.timeout = 1200 # 20 minutes + elif ns.fast_ci: + # Similar to: -u "all,-cpu" --timeout=600 + if not ns.use: + ns.use = [['all', '-cpu']] + if ns.timeout is None: + ns.timeout = 600 # 10 minutes if ns.single and ns.fromfile: parser.error("-s and -f don't go together!") @@ -382,7 +440,7 @@ def _parse_args(args, **kwargs): ns.python = shlex.split(ns.python) if ns.failfast and not (ns.verbose or ns.verbose3): parser.error("-G/--failfast needs either -v or -W") - if ns.pgo and (ns.verbose or ns.verbose2 or ns.verbose3): + if ns.pgo and (ns.verbose or ns.rerun or ns.verbose3): parser.error("--pgo/-v don't go together!") if ns.pgo_extended: ns.pgo = True # pgo_extended implies pgo @@ -396,10 +454,6 @@ def _parse_args(args, **kwargs): if ns.timeout is not None: if ns.timeout <= 0: ns.timeout = None - if ns.use_mp is not None: - if ns.use_mp <= 0: - # Use all cores + extras for tests that like to sleep - ns.use_mp = 2 + (os.cpu_count() or 1) if ns.use: for a in ns.use: for r in a: @@ -443,4 +497,13 @@ def _parse_args(args, **kwargs): # --forever implies --failfast ns.failfast = True + if ns.huntrleaks: + warmup, repetitions, _ = ns.huntrleaks + if warmup < 1 or repetitions < 1: + msg = ("Invalid values for the --huntrleaks/-R parameters. 
The " + "number of warmups and repetitions must be at least 1 " + "each (1:1).") + print(msg, file=sys.stderr, flush=True) + sys.exit(2) + return ns diff --git a/Lib/test/libregrtest/findtests.py b/Lib/test/libregrtest/findtests.py new file mode 100644 index 00000000000000..96cc3e0d021184 --- /dev/null +++ b/Lib/test/libregrtest/findtests.py @@ -0,0 +1,105 @@ +import os +import sys +import unittest + +from test import support + +from .utils import ( + StrPath, TestName, TestTuple, TestList, FilterTuple, + abs_module_name, count, printlist) + + +# If these test directories are encountered recurse into them and treat each +# "test_*.py" file or each sub-directory as a separate test module. This can +# increase parallelism. +# +# Beware this can't generally be done for any directory with sub-tests as the +# __init__.py may do things which alter what tests are to be run. +SPLITTESTDIRS: set[TestName] = { + "test_asyncio", + "test_concurrent_futures", + "test_future_stmt", + "test_gdb", + "test_multiprocessing_fork", + "test_multiprocessing_forkserver", + "test_multiprocessing_spawn", +} + + +def findtestdir(path: StrPath | None = None) -> StrPath: + return path or os.path.dirname(os.path.dirname(__file__)) or os.curdir + + +def findtests(*, testdir: StrPath | None = None, exclude=(), + split_test_dirs: set[TestName] = SPLITTESTDIRS, + base_mod: str = "") -> TestList: + """Return a list of all applicable test modules.""" + testdir = findtestdir(testdir) + tests = [] + for name in os.listdir(testdir): + mod, ext = os.path.splitext(name) + if (not mod.startswith("test_")) or (mod in exclude): + continue + if base_mod: + fullname = f"{base_mod}.{mod}" + else: + fullname = mod + if fullname in split_test_dirs: + subdir = os.path.join(testdir, mod) + if not base_mod: + fullname = f"test.{mod}" + tests.extend(findtests(testdir=subdir, exclude=exclude, + split_test_dirs=split_test_dirs, + base_mod=fullname)) + elif ext in (".py", ""): + tests.append(fullname) + return sorted(tests) + + +def split_test_packages(tests, *, testdir: StrPath | None = None, exclude=(), + split_test_dirs=SPLITTESTDIRS): + testdir = findtestdir(testdir) + splitted = [] + for name in tests: + if name in split_test_dirs: + subdir = os.path.join(testdir, name) + splitted.extend(findtests(testdir=subdir, exclude=exclude, + split_test_dirs=split_test_dirs, + base_mod=name)) + else: + splitted.append(name) + return splitted + + +def _list_cases(suite): + for test in suite: + if isinstance(test, unittest.loader._FailedTest): + continue + if isinstance(test, unittest.TestSuite): + _list_cases(test) + elif isinstance(test, unittest.TestCase): + if support.match_test(test): + print(test.id()) + +def list_cases(tests: TestTuple, *, + match_tests: FilterTuple | None = None, + ignore_tests: FilterTuple | None = None, + test_dir: StrPath | None = None): + support.verbose = False + support.set_match_tests(match_tests, ignore_tests) + + skipped = [] + for test_name in tests: + module_name = abs_module_name(test_name, test_dir) + try: + suite = unittest.defaultTestLoader.loadTestsFromName(module_name) + _list_cases(suite) + except unittest.SkipTest: + skipped.append(test_name) + + if skipped: + sys.stdout.flush() + stderr = sys.stderr + print(file=stderr) + print(count(len(skipped), "test"), "skipped:", file=stderr) + printlist(skipped, file=stderr) diff --git a/Lib/test/libregrtest/logger.py b/Lib/test/libregrtest/logger.py new file mode 100644 index 00000000000000..a125706927393c --- /dev/null +++ b/Lib/test/libregrtest/logger.py @@ -0,0 
+1,86 @@ +import os +import time + +from test.support import MS_WINDOWS +from .results import TestResults +from .runtests import RunTests +from .utils import print_warning + +if MS_WINDOWS: + from .win_utils import WindowsLoadTracker + + +class Logger: + def __init__(self, results: TestResults, quiet: bool, pgo: bool): + self.start_time = time.perf_counter() + self.test_count_text = '' + self.test_count_width = 3 + self.win_load_tracker: WindowsLoadTracker | None = None + self._results: TestResults = results + self._quiet: bool = quiet + self._pgo: bool = pgo + + def log(self, line: str = '') -> None: + empty = not line + + # add the system load prefix: "load avg: 1.80 " + load_avg = self.get_load_avg() + if load_avg is not None: + line = f"load avg: {load_avg:.2f} {line}" + + # add the timestamp prefix: "0:01:05 " + log_time = time.perf_counter() - self.start_time + + mins, secs = divmod(int(log_time), 60) + hours, mins = divmod(mins, 60) + formatted_log_time = "%d:%02d:%02d" % (hours, mins, secs) + + line = f"{formatted_log_time} {line}" + if empty: + line = line[:-1] + + print(line, flush=True) + + def get_load_avg(self) -> float | None: + if hasattr(os, 'getloadavg'): + return os.getloadavg()[0] + if self.win_load_tracker is not None: + return self.win_load_tracker.getloadavg() + return None + + def display_progress(self, test_index: int, text: str) -> None: + if self._quiet: + return + results = self._results + + # "[ 51/405/1] test_tcl passed" + line = f"{test_index:{self.test_count_width}}{self.test_count_text}" + fails = len(results.bad) + len(results.env_changed) + if fails and not self._pgo: + line = f"{line}/{fails}" + self.log(f"[{line}] {text}") + + def set_tests(self, runtests: RunTests) -> None: + if runtests.forever: + self.test_count_text = '' + self.test_count_width = 3 + else: + self.test_count_text = '/{}'.format(len(runtests.tests)) + self.test_count_width = len(self.test_count_text) - 1 + + def start_load_tracker(self) -> None: + if not MS_WINDOWS: + return + + try: + self.win_load_tracker = WindowsLoadTracker() + except PermissionError as error: + # Standard accounts may not have access to the performance + # counters. + print_warning(f'Failed to create WindowsLoadTracker: {error}') + + def stop_load_tracker(self) -> None: + if self.win_load_tracker is None: + return + self.win_load_tracker.close() + self.win_load_tracker = None diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index 7192244c69c55e..fe35df05c80e89 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -1,45 +1,30 @@ -import faulthandler -import locale import os -import platform import random import re +import shlex import sys import sysconfig -import tempfile import time -import unittest -from test.libregrtest.cmdline import _parse_args -from test.libregrtest.runtest import ( - findtests, split_test_packages, runtest, get_abs_module, - PROGRESS_MIN_TIME, State) -from test.libregrtest.setup import setup_tests -from test.libregrtest.pgo import setup_pgo_tests -from test.libregrtest.utils import (removepy, count, format_duration, - printlist, get_build_info) -from test import support -from test.support import TestStats -from test.support import os_helper -from test.support import threading_helper - - -# bpo-38203: Maximum delay in seconds to exit Python (call Py_Finalize()). -# Used to protect against threading._shutdown() hang. -# Must be smaller than buildbot "1200 seconds without output" limit. 
-EXIT_TIMEOUT = 120.0 -# gh-90681: When rerunning tests, we might need to rerun the whole -# class or module suite if some its life-cycle hooks fail. -# Test level hooks are not affected. -_TEST_LIFECYCLE_HOOKS = frozenset(( - 'setUpClass', 'tearDownClass', - 'setUpModule', 'tearDownModule', -)) - -EXITCODE_BAD_TEST = 2 -EXITCODE_INTERRUPTED = 130 -EXITCODE_ENV_CHANGED = 3 -EXITCODE_NO_TESTS_RAN = 4 +from test import support +from test.support import os_helper, MS_WINDOWS + +from .cmdline import _parse_args, Namespace +from .findtests import findtests, split_test_packages, list_cases +from .logger import Logger +from .pgo import setup_pgo_tests +from .result import State +from .results import TestResults, EXITCODE_INTERRUPTED +from .runtests import RunTests, HuntRefleak +from .setup import setup_process, setup_test_dir +from .single import run_single_test, PROGRESS_MIN_TIME +from .utils import ( + StrPath, StrJSON, TestName, TestList, TestTuple, FilterTuple, + strip_py_suffix, count, format_duration, + printlist, get_temp_dir, get_work_dir, exit_timeout, + display_header, cleanup_temp_dir, print_warning, + is_cross_compiled, get_host_runner, process_cpu_count, + EXIT_TIMEOUT) class Regrtest: @@ -65,266 +50,212 @@ class Regrtest: directly to set the values that would normally be set by flags on the command line. """ - def __init__(self): - # Namespace of command line options - self.ns = None + def __init__(self, ns: Namespace, _add_python_opts: bool = False): + # Log verbosity + self.verbose: int = int(ns.verbose) + self.quiet: bool = ns.quiet + self.pgo: bool = ns.pgo + self.pgo_extended: bool = ns.pgo_extended + + # Test results + self.results: TestResults = TestResults() + self.first_state: str | None = None + + # Logger + self.logger = Logger(self.results, self.quiet, self.pgo) + + # Actions + self.want_header: bool = ns.header + self.want_list_tests: bool = ns.list_tests + self.want_list_cases: bool = ns.list_cases + self.want_wait: bool = ns.wait + self.want_cleanup: bool = ns.cleanup + self.want_rerun: bool = ns.rerun + self.want_run_leaks: bool = ns.runleaks + + self.ci_mode: bool = (ns.fast_ci or ns.slow_ci) + self.want_add_python_opts: bool = (_add_python_opts + and ns._add_python_opts) + + # Select tests + if ns.match_tests: + self.match_tests: FilterTuple | None = tuple(ns.match_tests) + else: + self.match_tests = None + if ns.ignore_tests: + self.ignore_tests: FilterTuple | None = tuple(ns.ignore_tests) + else: + self.ignore_tests = None + self.exclude: bool = ns.exclude + self.fromfile: StrPath | None = ns.fromfile + self.starting_test: TestName | None = ns.start + self.cmdline_args: TestList = ns.args + + # Workers + if ns.use_mp is None: + num_workers = 0 # run sequentially + elif ns.use_mp <= 0: + num_workers = -1 # use the number of CPUs + else: + num_workers = ns.use_mp + self.num_workers: int = num_workers + self.worker_json: StrJSON | None = ns.worker_json + + # Options to run tests + self.fail_fast: bool = ns.failfast + self.fail_env_changed: bool = ns.fail_env_changed + self.fail_rerun: bool = ns.fail_rerun + self.forever: bool = ns.forever + self.output_on_failure: bool = ns.verbose3 + self.timeout: float | None = ns.timeout + if ns.huntrleaks: + warmups, runs, filename = ns.huntrleaks + filename = os.path.abspath(filename) + self.hunt_refleak: HuntRefleak | None = HuntRefleak(warmups, runs, filename) + else: + self.hunt_refleak = None + self.test_dir: StrPath | None = ns.testdir + self.junit_filename: StrPath | None = ns.xmlpath + self.memory_limit: str | None = 
ns.memlimit + self.gc_threshold: int | None = ns.threshold + self.use_resources: tuple[str, ...] = tuple(ns.use_resources) + if ns.python: + self.python_cmd: tuple[str, ...] | None = tuple(ns.python) + else: + self.python_cmd = None + self.coverage: bool = ns.trace + self.coverage_dir: StrPath | None = ns.coverdir + self.tmp_dir: StrPath | None = ns.tempdir + + # Randomize + self.randomize: bool = ns.randomize + self.random_seed: int | None = ( + ns.random_seed + if ns.random_seed is not None + else random.getrandbits(32) + ) + if 'SOURCE_DATE_EPOCH' in os.environ: + self.randomize = False + self.random_seed = None # tests - self.tests = [] - self.selected = [] - - # test results - self.good = [] - self.bad = [] - self.skipped = [] - self.resource_denied = [] - self.environment_changed = [] - self.run_no_tests = [] - self.need_rerun = [] - self.rerun = [] - self.first_result = None - self.interrupted = False - self.stats_dict: dict[str, TestStats] = {} - - # used by --slow - self.test_times = [] - - # used by --coverage, trace.Trace instance - self.tracer = None + self.first_runtests: RunTests | None = None + + # used by --slowest + self.print_slowest: bool = ns.print_slow # used to display the progress bar "[ 3/100]" self.start_time = time.perf_counter() - self.test_count = '' - self.test_count_width = 1 # used by --single - self.next_single_test = None - self.next_single_filename = None - - # used by --junit-xml - self.testsuite_xml = None - - # misc - self.win_load_tracker = None - self.tmp_dir = None - self.worker_test_name = None - - def get_executed(self): - return (set(self.good) | set(self.bad) | set(self.skipped) - | set(self.resource_denied) | set(self.environment_changed) - | set(self.run_no_tests)) - - def accumulate_result(self, result, rerun=False): - test_name = result.test_name - - if result.has_meaningful_duration() and not rerun: - self.test_times.append((result.duration, test_name)) - - match result.state: - case State.PASSED: - self.good.append(test_name) - case State.ENV_CHANGED: - self.environment_changed.append(test_name) - case State.SKIPPED: - self.skipped.append(test_name) - case State.RESOURCE_DENIED: - self.skipped.append(test_name) - self.resource_denied.append(test_name) - case State.INTERRUPTED: - self.interrupted = True - case State.DID_NOT_RUN: - self.run_no_tests.append(test_name) - case _: - if result.is_failed(self.ns.fail_env_changed): - if not rerun: - self.bad.append(test_name) - self.need_rerun.append(result) - else: - raise ValueError(f"invalid test state: {state!r}") - - if result.stats is not None: - self.stats_dict[result.test_name] = result.stats - - if rerun and not(result.is_failed(False) or result.state == State.INTERRUPTED): - self.bad.remove(test_name) - - xml_data = result.xml_data - if xml_data: - import xml.etree.ElementTree as ET - for e in xml_data: - try: - self.testsuite_xml.append(ET.fromstring(e)) - except ET.ParseError: - print(xml_data, file=sys.__stderr__) - raise + self.single_test_run: bool = ns.single + self.next_single_test: TestName | None = None + self.next_single_filename: StrPath | None = None def log(self, line=''): - empty = not line - - # add the system load prefix: "load avg: 1.80 " - load_avg = self.getloadavg() - if load_avg is not None: - line = f"load avg: {load_avg:.2f} {line}" - - # add the timestamp prefix: "0:01:05 " - test_time = time.perf_counter() - self.start_time - - mins, secs = divmod(int(test_time), 60) - hours, mins = divmod(mins, 60) - test_time = "%d:%02d:%02d" % (hours, mins, secs) - - line = 
f"{test_time} {line}" - if empty: - line = line[:-1] - - print(line, flush=True) - - def display_progress(self, test_index, text): - if self.ns.quiet: - return - - # "[ 51/405/1] test_tcl passed" - line = f"{test_index:{self.test_count_width}}{self.test_count}" - fails = len(self.bad) + len(self.environment_changed) - if fails and not self.ns.pgo: - line = f"{line}/{fails}" - self.log(f"[{line}] {text}") - - def parse_args(self, kwargs): - ns = _parse_args(sys.argv[1:], **kwargs) - - if ns.xmlpath: - support.junit_xml_list = self.testsuite_xml = [] - - worker_args = ns.worker_args - if worker_args is not None: - from test.libregrtest.runtest_mp import parse_worker_args - ns, test_name = parse_worker_args(ns.worker_args) - ns.worker_args = worker_args - self.worker_test_name = test_name - - # Strip .py extensions. - removepy(ns.args) + self.logger.log(line) - if ns.huntrleaks: - warmup, repetitions, _ = ns.huntrleaks - if warmup < 1 or repetitions < 1: - msg = ("Invalid values for the --huntrleaks/-R parameters. The " - "number of warmups and repetitions must be at least 1 " - "each (1:1).") - print(msg, file=sys.stderr, flush=True) - sys.exit(2) - - if ns.tempdir: - ns.tempdir = os.path.expanduser(ns.tempdir) - - self.ns = ns - - def find_tests(self, tests): - self.tests = tests - - if self.ns.single: + def find_tests(self, tests: TestList | None = None) -> tuple[TestTuple, TestList | None]: + if self.single_test_run: self.next_single_filename = os.path.join(self.tmp_dir, 'pynexttest') try: with open(self.next_single_filename, 'r') as fp: next_test = fp.read().strip() - self.tests = [next_test] + tests = [next_test] except OSError: pass - if self.ns.fromfile: - self.tests = [] + if self.fromfile: + tests = [] # regex to match 'test_builtin' in line: # '0:00:00 [ 4/400] test_builtin -- test_dict took 1 sec' regex = re.compile(r'\btest_[a-zA-Z0-9_]+\b') - with open(os.path.join(os_helper.SAVEDCWD, self.ns.fromfile)) as fp: + with open(os.path.join(os_helper.SAVEDCWD, self.fromfile)) as fp: for line in fp: line = line.split('#', 1)[0] line = line.strip() match = regex.search(line) if match is not None: - self.tests.append(match.group()) + tests.append(match.group()) - removepy(self.tests) + strip_py_suffix(tests) - if self.ns.pgo: + if self.pgo: # add default PGO tests if no tests are specified - setup_pgo_tests(self.ns) + setup_pgo_tests(self.cmdline_args, self.pgo_extended) - exclude = set() - if self.ns.exclude: - for arg in self.ns.args: - exclude.add(arg) - self.ns.args = [] + exclude_tests = set() + if self.exclude: + for arg in self.cmdline_args: + exclude_tests.add(arg) + self.cmdline_args = [] - alltests = findtests(testdir=self.ns.testdir, exclude=exclude) + alltests = findtests(testdir=self.test_dir, + exclude=exclude_tests) - if not self.ns.fromfile: - self.selected = self.tests or self.ns.args - if self.selected: - self.selected = split_test_packages(self.selected) + if not self.fromfile: + selected = tests or self.cmdline_args + if selected: + selected = split_test_packages(selected) else: - self.selected = alltests + selected = alltests else: - self.selected = self.tests + selected = tests - if self.ns.single: - self.selected = self.selected[:1] + if self.single_test_run: + selected = selected[:1] try: - pos = alltests.index(self.selected[0]) + pos = alltests.index(selected[0]) self.next_single_test = alltests[pos + 1] except IndexError: pass # Remove all the selected tests that precede start if it's set. 
- if self.ns.start: + if self.starting_test: try: - del self.selected[:self.selected.index(self.ns.start)] + del selected[:selected.index(self.starting_test)] except ValueError: - print("Couldn't find starting test (%s), using all tests" - % self.ns.start, file=sys.stderr) - - if self.ns.randomize: - if self.ns.random_seed is None: - self.ns.random_seed = random.randrange(10000000) - random.seed(self.ns.random_seed) - random.shuffle(self.selected) + print(f"Cannot find starting test: {self.starting_test}") + sys.exit(1) - def list_tests(self): - for name in self.selected: - print(name) - - def _list_cases(self, suite): - for test in suite: - if isinstance(test, unittest.loader._FailedTest): - continue - if isinstance(test, unittest.TestSuite): - self._list_cases(test) - elif isinstance(test, unittest.TestCase): - if support.match_test(test): - print(test.id()) - - def list_cases(self): - support.verbose = False - support.set_match_tests(self.ns.match_tests, self.ns.ignore_tests) - - for test_name in self.selected: - abstest = get_abs_module(self.ns, test_name) - try: - suite = unittest.defaultTestLoader.loadTestsFromName(abstest) - self._list_cases(suite) - except unittest.SkipTest: - self.skipped.append(test_name) + random.seed(self.random_seed) + if self.randomize: + random.shuffle(selected) - if self.skipped: - print(file=sys.stderr) - print(count(len(self.skipped), "test"), "skipped:", file=sys.stderr) - printlist(self.skipped, file=sys.stderr) + return (tuple(selected), tests) - def rerun_failed_tests(self): - self.log() + @staticmethod + def list_tests(tests: TestTuple): + for name in tests: + print(name) - if self.ns.python: + def _rerun_failed_tests(self, runtests: RunTests): + # Configure the runner to re-run tests + if self.num_workers == 0: + # Always run tests in fresh processes to have more deterministic + # initial state. Don't re-run tests in parallel but limit to a + # single worker process to have side effects (on the system load + # and timings) between tests. 
+ self.num_workers = 1 + + tests, match_tests_dict = self.results.prepare_rerun() + + # Re-run failed tests + self.log(f"Re-running {len(tests)} failed tests in verbose mode in subprocesses") + runtests = runtests.copy( + tests=tests, + rerun=True, + verbose=True, + forever=False, + fail_fast=False, + match_tests_dict=match_tests_dict, + output_on_failure=False) + self.logger.set_tests(runtests) + self._run_tests_mp(runtests, self.num_workers) + return runtests + + def rerun_failed_tests(self, runtests: RunTests): + if self.python_cmd: # Temp patch for https://github.com/python/cpython/issues/94052 self.log( "Re-running failed tests is not supported with --python " @@ -332,160 +263,81 @@ def rerun_failed_tests(self): ) return - self.ns.verbose = True - self.ns.failfast = False - self.ns.verbose3 = False - - self.first_result = self.get_tests_result() - - self.log("Re-running failed tests in verbose mode") - rerun_list = list(self.need_rerun) - self.need_rerun.clear() - for result in rerun_list: - test_name = result.test_name - self.rerun.append(test_name) - - errors = result.errors or [] - failures = result.failures or [] - error_names = [ - self.normalize_test_name(test_full_name, is_error=True) - for (test_full_name, *_) in errors] - failure_names = [ - self.normalize_test_name(test_full_name) - for (test_full_name, *_) in failures] - self.ns.verbose = True - orig_match_tests = self.ns.match_tests - if errors or failures: - if self.ns.match_tests is None: - self.ns.match_tests = [] - self.ns.match_tests.extend(error_names) - self.ns.match_tests.extend(failure_names) - matching = "matching: " + ", ".join(self.ns.match_tests) - self.log(f"Re-running {test_name} in verbose mode ({matching})") - else: - self.log(f"Re-running {test_name} in verbose mode") - result = runtest(self.ns, test_name) - self.ns.match_tests = orig_match_tests + self.first_state = self.get_state() - self.accumulate_result(result, rerun=True) + print() + rerun_runtests = self._rerun_failed_tests(runtests) - if result.state == State.INTERRUPTED: - break + if self.results.bad: + print(count(len(self.results.bad), 'test'), "failed again:") + printlist(self.results.bad) + + self.display_result(rerun_runtests) - if self.bad: - print(count(len(self.bad), 'test'), "failed again:") - printlist(self.bad) - - self.display_result() - - def normalize_test_name(self, test_full_name, *, is_error=False): - short_name = test_full_name.split(" ")[0] - if is_error and short_name in _TEST_LIFECYCLE_HOOKS: - # This means that we have a failure in a life-cycle hook, - # we need to rerun the whole module or class suite. - # Basically the error looks like this: - # ERROR: setUpClass (test.test_reg_ex.RegTest) - # or - # ERROR: setUpModule (test.test_reg_ex) - # So, we need to parse the class / module name. - lpar = test_full_name.index('(') - rpar = test_full_name.index(')') - return test_full_name[lpar + 1: rpar].split('.')[-1] - return short_name - - def display_result(self): + def display_result(self, runtests): # If running the test suite for PGO then no one cares about results. 
- if self.ns.pgo: + if runtests.pgo: return + state = self.get_state() print() - print("== Tests result: %s ==" % self.get_tests_result()) - - if self.interrupted: - print("Test suite interrupted by signal SIGINT.") - - omitted = set(self.selected) - self.get_executed() - if omitted: - print() - print(count(len(omitted), "test"), "omitted:") - printlist(omitted) - - if self.good and not self.ns.quiet: - print() - if (not self.bad - and not self.skipped - and not self.interrupted - and len(self.good) > 1): - print("All", end=' ') - print(count(len(self.good), "test"), "OK.") - - if self.ns.print_slow: - self.test_times.sort(reverse=True) - print() - print("10 slowest tests:") - for test_time, test in self.test_times[:10]: - print("- %s: %s" % (test, format_duration(test_time))) - - if self.bad: - print() - print(count(len(self.bad), "test"), "failed:") - printlist(self.bad) - - if self.environment_changed: - print() - print("{} altered the execution environment:".format( - count(len(self.environment_changed), "test"))) - printlist(self.environment_changed) - - if self.skipped and not self.ns.quiet: - print() - print(count(len(self.skipped), "test"), "skipped:") - printlist(self.skipped) - - if self.rerun: - print() - print("%s:" % count(len(self.rerun), "re-run test")) - printlist(self.rerun) - - if self.run_no_tests: - print() - print(count(len(self.run_no_tests), "test"), "run no tests:") - printlist(self.run_no_tests) - - def run_tests_sequential(self): - if self.ns.trace: + print(f"== Tests result: {state} ==") + + self.results.display_result(runtests.tests, + self.quiet, self.print_slowest) + + def run_test(self, test_name: TestName, runtests: RunTests, tracer): + if tracer is not None: + # If we're tracing code coverage, then we don't exit with status + # if on a false return value from main. + cmd = ('result = run_single_test(test_name, runtests)') + namespace = dict(locals()) + tracer.runctx(cmd, globals=globals(), locals=namespace) + result = namespace['result'] + else: + result = run_single_test(test_name, runtests) + + self.results.accumulate_result(result, runtests) + + return result + + def run_tests_sequentially(self, runtests): + if self.coverage: import trace - self.tracer = trace.Trace(trace=False, count=True) + tracer = trace.Trace(trace=False, count=True) + else: + tracer = None save_modules = sys.modules.keys() - msg = "Run tests sequentially" - if self.ns.timeout: - msg += " (timeout: %s)" % format_duration(self.ns.timeout) + jobs = runtests.get_jobs() + if jobs is not None: + tests = count(jobs, 'test') + else: + tests = 'tests' + msg = f"Run {tests} sequentially" + if runtests.timeout: + msg += " (timeout: %s)" % format_duration(runtests.timeout) self.log(msg) previous_test = None - for test_index, test_name in enumerate(self.tests, 1): + tests_iter = runtests.iter_tests() + for test_index, test_name in enumerate(tests_iter, 1): start_time = time.perf_counter() text = test_name if previous_test: text = '%s -- %s' % (text, previous_test) - self.display_progress(test_index, text) - - if self.tracer: - # If we're tracing code coverage, then we don't exit with status - # if on a false return value from main. 
- cmd = ('result = runtest(self.ns, test_name); ' - 'self.accumulate_result(result)') - ns = dict(locals()) - self.tracer.runctx(cmd, globals=globals(), locals=ns) - result = ns['result'] - else: - result = runtest(self.ns, test_name) - self.accumulate_result(result) + self.logger.display_progress(test_index, text) + + result = self.run_test(test_name, runtests, tracer) - if result.state == State.INTERRUPTED: + # Unload the newly imported modules (best effort finalization) + for module in sys.modules.keys(): + if module not in save_modules and module.startswith("test."): + support.unload(module) + + if result.must_stop(self.fail_fast, self.fail_env_changed): break previous_test = str(result) @@ -496,140 +348,22 @@ def run_tests_sequential(self): # be quiet: say nothing if the test passed shortly previous_test = None - # Unload the newly imported modules (best effort finalization) - for module in sys.modules.keys(): - if module not in save_modules and module.startswith("test."): - support.unload(module) - - if self.ns.failfast and result.is_failed(self.ns.fail_env_changed): - break - if previous_test: print(previous_test) - def _test_forever(self, tests): - while True: - for test_name in tests: - yield test_name - if self.bad: - return - if self.ns.fail_env_changed and self.environment_changed: - return - - def display_header(self): - # Print basic platform information - print("==", platform.python_implementation(), *sys.version.split()) - print("==", platform.platform(aliased=True), - "%s-endian" % sys.byteorder) - print("== Python build:", ' '.join(get_build_info())) - print("== cwd:", os.getcwd()) - cpu_count = os.cpu_count() - if cpu_count: - print("== CPU count:", cpu_count) - print("== encodings: locale=%s, FS=%s" - % (locale.getencoding(), sys.getfilesystemencoding())) - self.display_sanitizers() - - def display_sanitizers(self): - # This makes it easier to remember what to set in your local - # environment when trying to reproduce a sanitizer failure. - asan = support.check_sanitizer(address=True) - msan = support.check_sanitizer(memory=True) - ubsan = support.check_sanitizer(ub=True) - sanitizers = [] - if asan: - sanitizers.append("address") - if msan: - sanitizers.append("memory") - if ubsan: - sanitizers.append("undefined behavior") - if not sanitizers: - return - - print(f"== sanitizers: {', '.join(sanitizers)}") - for sanitizer, env_var in ( - (asan, "ASAN_OPTIONS"), - (msan, "MSAN_OPTIONS"), - (ubsan, "UBSAN_OPTIONS"), - ): - options= os.environ.get(env_var) - if sanitizer and options is not None: - print(f"== {env_var}={options!r}") - - def no_tests_run(self): - return not any((self.good, self.bad, self.skipped, self.interrupted, - self.environment_changed)) - - def get_tests_result(self): - result = [] - if self.bad: - result.append("FAILURE") - elif self.ns.fail_env_changed and self.environment_changed: - result.append("ENV CHANGED") - elif self.no_tests_run(): - result.append("NO TESTS RAN") - - if self.interrupted: - result.append("INTERRUPTED") - - if not result: - result.append("SUCCESS") - - result = ', '.join(result) - if self.first_result: - result = '%s then %s' % (self.first_result, result) - return result + return tracer - def run_tests(self): - # For a partial run, we do not need to clutter the output. 
- if (self.ns.header - or not(self.ns.pgo or self.ns.quiet or self.ns.single - or self.tests or self.ns.args)): - self.display_header() - - if self.ns.huntrleaks: - warmup, repetitions, _ = self.ns.huntrleaks - if warmup < 3: - msg = ("WARNING: Running tests with --huntrleaks/-R and less than " - "3 warmup repetitions can give false positives!") - print(msg, file=sys.stdout, flush=True) - - if self.ns.randomize: - print("Using random seed", self.ns.random_seed) - - if self.ns.forever: - self.tests = self._test_forever(list(self.selected)) - self.test_count = '' - self.test_count_width = 3 - else: - self.tests = iter(self.selected) - self.test_count = '/{}'.format(len(self.selected)) - self.test_count_width = len(self.test_count) - 1 - - if self.ns.use_mp: - from test.libregrtest.runtest_mp import run_tests_multiprocess - # If we're on windows and this is the parent runner (not a worker), - # track the load average. - if sys.platform == 'win32' and self.worker_test_name is None: - from test.libregrtest.win_utils import WindowsLoadTracker - - try: - self.win_load_tracker = WindowsLoadTracker() - except PermissionError as error: - # Standard accounts may not have access to the performance - # counters. - print(f'Failed to create WindowsLoadTracker: {error}') + def get_state(self): + state = self.results.get_state(self.fail_env_changed) + if self.first_state: + state = f'{self.first_state} then {state}' + return state - try: - run_tests_multiprocess(self) - finally: - if self.win_load_tracker is not None: - self.win_load_tracker.close() - self.win_load_tracker = None - else: - self.run_tests_sequential() + def _run_tests_mp(self, runtests: RunTests, num_workers: int) -> None: + from .run_workers import RunWorkers + RunWorkers(num_workers, runtests, self.logger, self.results).run() - def finalize(self): + def finalize_tests(self, tracer): if self.next_single_filename: if self.next_single_test: with open(self.next_single_filename, 'w') as fp: @@ -637,232 +371,299 @@ def finalize(self): else: os.unlink(self.next_single_filename) - if self.tracer: - r = self.tracer.results() - r.write_results(show_missing=True, summary=True, - coverdir=self.ns.coverdir) - - print() - self.display_summary() + if tracer is not None: + results = tracer.results() + results.write_results(show_missing=True, summary=True, + coverdir=self.coverage_dir) - if self.ns.runleaks: + if self.want_run_leaks: os.system("leaks %d" % os.getpid()) + if self.junit_filename: + self.results.write_junit(self.junit_filename) + def display_summary(self): - duration = time.perf_counter() - self.start_time + duration = time.perf_counter() - self.logger.start_time + filtered = bool(self.match_tests) or bool(self.ignore_tests) # Total duration + print() print("Total duration: %s" % format_duration(duration)) - # Total tests - total = TestStats() - for stats in self.stats_dict.values(): - total.accumulate(stats) - stats = [f'run={total.tests_run:,}'] - if total.failures: - stats.append(f'failures={total.failures:,}') - if total.skipped: - stats.append(f'skipped={total.skipped:,}') - print(f"Total tests: {' '.join(stats)}") - - # Total test files - report = [f'success={len(self.good)}'] - if self.bad: - report.append(f'failed={len(self.bad)}') - if self.environment_changed: - report.append(f'env_changed={len(self.environment_changed)}') - if self.skipped: - report.append(f'skipped={len(self.skipped)}') - if self.resource_denied: - report.append(f'resource_denied={len(self.resource_denied)}') - if self.rerun: - 
report.append(f'rerun={len(self.rerun)}') - if self.run_no_tests: - report.append(f'run_no_tests={len(self.run_no_tests)}') - print(f"Total test files: {' '.join(report)}") + self.results.display_summary(self.first_runtests, filtered) # Result - result = self.get_tests_result() - print(f"Result: {result}") + state = self.get_state() + print(f"Result: {state}") + + def create_run_tests(self, tests: TestTuple): + return RunTests( + tests, + fail_fast=self.fail_fast, + fail_env_changed=self.fail_env_changed, + match_tests=self.match_tests, + ignore_tests=self.ignore_tests, + match_tests_dict=None, + rerun=False, + forever=self.forever, + pgo=self.pgo, + pgo_extended=self.pgo_extended, + output_on_failure=self.output_on_failure, + timeout=self.timeout, + verbose=self.verbose, + quiet=self.quiet, + hunt_refleak=self.hunt_refleak, + test_dir=self.test_dir, + use_junit=(self.junit_filename is not None), + memory_limit=self.memory_limit, + gc_threshold=self.gc_threshold, + use_resources=self.use_resources, + python_cmd=self.python_cmd, + randomize=self.randomize, + random_seed=self.random_seed, + json_file=None, + ) + + def _run_tests(self, selected: TestTuple, tests: TestList | None) -> int: + if self.hunt_refleak and self.hunt_refleak.warmups < 3: + msg = ("WARNING: Running tests with --huntrleaks/-R and " + "less than 3 warmup repetitions can give false positives!") + print(msg, file=sys.stdout, flush=True) + + if self.num_workers < 0: + # Use all CPUs + 2 extra worker processes for tests + # that like to sleep + self.num_workers = (process_cpu_count() or 1) + 2 - def save_xml_result(self): - if not self.ns.xmlpath and not self.testsuite_xml: - return + # For a partial run, we do not need to clutter the output. + if (self.want_header + or not(self.pgo or self.quiet or self.single_test_run + or tests or self.cmdline_args)): + display_header(self.use_resources, self.python_cmd) - import xml.etree.ElementTree as ET - root = ET.Element("testsuites") - - # Manually count the totals for the overall summary - totals = {'tests': 0, 'errors': 0, 'failures': 0} - for suite in self.testsuite_xml: - root.append(suite) - for k in totals: - try: - totals[k] += int(suite.get(k, 0)) - except ValueError: - pass - - for k, v in totals.items(): - root.set(k, str(v)) - - xmlpath = os.path.join(os_helper.SAVEDCWD, self.ns.xmlpath) - with open(xmlpath, 'wb') as f: - for s in ET.tostringlist(root): - f.write(s) - - def fix_umask(self): - if support.is_emscripten: - # Emscripten has default umask 0o777, which breaks some tests. - # see https://github.com/emscripten-core/emscripten/issues/17269 - old_mask = os.umask(0) - if old_mask == 0o777: - os.umask(0o027) - else: - os.umask(old_mask) - - def set_temp_dir(self): - if self.ns.tempdir: - self.tmp_dir = self.ns.tempdir - - if not self.tmp_dir: - # When tests are run from the Python build directory, it is best practice - # to keep the test files in a subfolder. This eases the cleanup of leftover - # files using the "make distclean" command. - if sysconfig.is_python_build(): - self.tmp_dir = sysconfig.get_config_var('abs_builddir') - if self.tmp_dir is None: - self.tmp_dir = sysconfig.get_config_var('abs_srcdir') - if not self.tmp_dir: - # gh-74470: On Windows, only srcdir is available. Using - # abs_builddir mostly matters on UNIX when building - # Python out of the source tree, especially when the - # source tree is read only. 
- self.tmp_dir = sysconfig.get_config_var('srcdir') - self.tmp_dir = os.path.join(self.tmp_dir, 'build') - else: - self.tmp_dir = tempfile.gettempdir() + print("Using random seed", self.random_seed) - self.tmp_dir = os.path.abspath(self.tmp_dir) + runtests = self.create_run_tests(selected) + self.first_runtests = runtests + self.logger.set_tests(runtests) - def create_temp_dir(self): - os.makedirs(self.tmp_dir, exist_ok=True) + setup_process() - # Define a writable temp dir that will be used as cwd while running - # the tests. The name of the dir includes the pid to allow parallel - # testing (see the -j option). - # Emscripten and WASI have stubbed getpid(), Emscripten has only - # milisecond clock resolution. Use randint() instead. - if sys.platform in {"emscripten", "wasi"}: - nounce = random.randint(0, 1_000_000) - else: - nounce = os.getpid() - if self.worker_test_name is not None: - test_cwd = 'test_python_worker_{}'.format(nounce) + if self.hunt_refleak and not self.num_workers: + # gh-109739: WindowsLoadTracker thread interfers with refleak check + use_load_tracker = False else: - test_cwd = 'test_python_{}'.format(nounce) - test_cwd += os_helper.FS_NONASCII - test_cwd = os.path.join(self.tmp_dir, test_cwd) - return test_cwd - - def cleanup(self): - import glob - - path = os.path.join(glob.escape(self.tmp_dir), 'test_python_*') - print("Cleanup %s directory" % self.tmp_dir) - for name in glob.glob(path): - if os.path.isdir(name): - print("Remove directory: %s" % name) - os_helper.rmtree(name) + # WindowsLoadTracker is only needed on Windows + use_load_tracker = MS_WINDOWS + + if use_load_tracker: + self.logger.start_load_tracker() + try: + if self.num_workers: + self._run_tests_mp(runtests, self.num_workers) + tracer = None else: - print("Remove file: %s" % name) - os_helper.unlink(name) + tracer = self.run_tests_sequentially(runtests) - def main(self, tests=None, **kwargs): - self.parse_args(kwargs) + self.display_result(runtests) - self.set_temp_dir() + if self.want_rerun and self.results.need_rerun(): + self.rerun_failed_tests(runtests) + finally: + if use_load_tracker: + self.logger.stop_load_tracker() - self.fix_umask() + self.display_summary() + self.finalize_tests(tracer) - if self.ns.cleanup: - self.cleanup() - sys.exit(0) + return self.results.get_exitcode(self.fail_env_changed, + self.fail_rerun) - test_cwd = self.create_temp_dir() + def run_tests(self, selected: TestTuple, tests: TestList | None) -> int: + os.makedirs(self.tmp_dir, exist_ok=True) + work_dir = get_work_dir(self.tmp_dir) - try: - # Run the tests in a context manager that temporarily changes the CWD - # to a temporary and writable directory. If it's not possible to - # create or change the CWD, the original CWD will be used. + # Put a timeout on Python exit + with exit_timeout(): + # Run the tests in a context manager that temporarily changes the + # CWD to a temporary and writable directory. If it's not possible + # to create or change the CWD, the original CWD will be used. # The original CWD is available from os_helper.SAVEDCWD. - with os_helper.temp_cwd(test_cwd, quiet=True): - # When using multiprocessing, worker processes will use test_cwd - # as their parent temporary directory. So when the main process - # exit, it removes also subdirectories of worker processes. 
- self.ns.tempdir = test_cwd - - self._main(tests, kwargs) - except SystemExit as exc: - # bpo-38203: Python can hang at exit in Py_Finalize(), especially - # on threading._shutdown() call: put a timeout - if threading_helper.can_start_thread: - faulthandler.dump_traceback_later(EXIT_TIMEOUT, exit=True) + with os_helper.temp_cwd(work_dir, quiet=True): + # When using multiprocessing, worker processes will use + # work_dir as their parent temporary directory. So when the + # main process exit, it removes also subdirectories of worker + # processes. + return self._run_tests(selected, tests) + + def _add_cross_compile_opts(self, regrtest_opts): + # WASM/WASI buildbot builders pass multiple PYTHON environment + # variables such as PYTHONPATH and _PYTHON_HOSTRUNNER. + keep_environ = bool(self.python_cmd) + environ = None + + # Are we using cross-compilation? + cross_compile = is_cross_compiled() + + # Get HOSTRUNNER + hostrunner = get_host_runner() + + if cross_compile: + # emulate -E, but keep PYTHONPATH + cross compile env vars, + # so test executable can load correct sysconfigdata file. + keep = { + '_PYTHON_PROJECT_BASE', + '_PYTHON_HOST_PLATFORM', + '_PYTHON_SYSCONFIGDATA_NAME', + 'PYTHONPATH' + } + old_environ = os.environ + new_environ = { + name: value for name, value in os.environ.items() + if not name.startswith(('PYTHON', '_PYTHON')) or name in keep + } + # Only set environ if at least one variable was removed + if new_environ != old_environ: + environ = new_environ + keep_environ = True + + if cross_compile and hostrunner: + if self.num_workers == 0: + # For now use only two cores for cross-compiled builds; + # hostrunner can be expensive. + regrtest_opts.extend(['-j', '2']) + + # If HOSTRUNNER is set and -p/--python option is not given, then + # use hostrunner to execute python binary for tests. + if not self.python_cmd: + buildpython = sysconfig.get_config_var("BUILDPYTHON") + python_cmd = f"{hostrunner} {buildpython}" + regrtest_opts.extend(["--python", python_cmd]) + keep_environ = True + + return (environ, keep_environ) + + def _add_ci_python_opts(self, python_opts, keep_environ): + # --fast-ci and --slow-ci add options to Python: + # "-u -W default -bb -E" + + # Unbuffered stdout and stderr + if not sys.stdout.write_through: + python_opts.append('-u') + + # Add warnings filter 'default' + if 'default' not in sys.warnoptions: + python_opts.extend(('-W', 'default')) + + # Error on bytes/str comparison + if sys.flags.bytes_warning < 2: + python_opts.append('-bb') + + if not keep_environ: + # Ignore PYTHON* environment variables + if not sys.flags.ignore_environment: + python_opts.append('-E') + + def _execute_python(self, cmd, environ): + # Make sure that messages before execv() are logged + sys.stdout.flush() + sys.stderr.flush() + + cmd_text = shlex.join(cmd) + try: + print(f"+ {cmd_text}", flush=True) - sys.exit(exc.code) + if hasattr(os, 'execv') and not MS_WINDOWS: + os.execv(cmd[0], cmd) + # On success, execv() do no return. + # On error, it raises an OSError. + else: + import subprocess + with subprocess.Popen(cmd, env=environ) as proc: + try: + proc.wait() + except KeyboardInterrupt: + # There is no need to call proc.terminate(): on CTRL+C, + # SIGTERM is also sent to the child process. 
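The _add_ci_python_opts() hunk above decides which interpreter flags --fast-ci/--slow-ci should re-exec Python with ("-u -W default -bb -E"), adding each flag only when the running interpreter does not already have the equivalent setting. The following is a rough, self-contained sketch of that selection, checking the same sys state; it is an illustration, not the patch code itself:

    # Sketch of the CI flag selection performed by _add_ci_python_opts()
    # in the hunk above; a simplified illustration, not the patch itself.
    import sys

    def ci_python_opts(keep_environ: bool) -> list[str]:
        opts: list[str] = []
        if not sys.stdout.write_through:         # unbuffered stdout/stderr
            opts.append('-u')
        if 'default' not in sys.warnoptions:     # warnings filter "default"
            opts.extend(('-W', 'default'))
        if sys.flags.bytes_warning < 2:          # error on bytes/str comparison
            opts.append('-bb')
        if not keep_environ and not sys.flags.ignore_environment:
            opts.append('-E')                    # ignore PYTHON* env vars
        return opts

    if __name__ == "__main__":
        # Typically prints ['-u', '-W', 'default', '-bb', '-E'] under a
        # default interpreter; the list shrinks if flags are already set.
        print(ci_python_opts(keep_environ=False))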
+ try: + proc.wait(timeout=EXIT_TIMEOUT) + except subprocess.TimeoutExpired: + proc.kill() + proc.wait() + sys.exit(EXITCODE_INTERRUPTED) + + sys.exit(proc.returncode) + except Exception as exc: + print_warning(f"Failed to change Python options: {exc!r}\n" + f"Command: {cmd_text}") + # continue executing main() + + def _add_python_opts(self): + python_opts = [] + regrtest_opts = [] + + environ, keep_environ = self._add_cross_compile_opts(regrtest_opts) + if self.ci_mode: + self._add_ci_python_opts(python_opts, keep_environ) + + if (not python_opts) and (not regrtest_opts) and (environ is None): + # Nothing changed: nothing to do + return - def getloadavg(self): - if self.win_load_tracker is not None: - return self.win_load_tracker.getloadavg() + # Create new command line + cmd = list(sys.orig_argv) + if python_opts: + cmd[1:1] = python_opts + if regrtest_opts: + cmd.extend(regrtest_opts) + cmd.append("--dont-add-python-opts") - if hasattr(os, 'getloadavg'): - return os.getloadavg()[0] + self._execute_python(cmd, environ) - return None + def _init(self): + # Set sys.stdout encoder error handler to backslashreplace, + # similar to sys.stderr error handler, to avoid UnicodeEncodeError + # when printing a traceback or any other non-encodable character. + sys.stdout.reconfigure(errors="backslashreplace") - def _main(self, tests, kwargs): - if self.worker_test_name is not None: - from test.libregrtest.runtest_mp import run_tests_worker - run_tests_worker(self.ns, self.worker_test_name) + if self.junit_filename and not os.path.isabs(self.junit_filename): + self.junit_filename = os.path.abspath(self.junit_filename) - if self.ns.wait: - input("Press any key to continue...") + strip_py_suffix(self.cmdline_args) - support.PGO = self.ns.pgo - support.PGO_EXTENDED = self.ns.pgo_extended + self.tmp_dir = get_temp_dir(self.tmp_dir) - setup_tests(self.ns) + def main(self, tests: TestList | None = None): + if self.want_add_python_opts: + self._add_python_opts() - self.find_tests(tests) + self._init() - if self.ns.list_tests: - self.list_tests() + if self.want_cleanup: + cleanup_temp_dir(self.tmp_dir) sys.exit(0) - if self.ns.list_cases: - self.list_cases() - sys.exit(0) - - self.run_tests() - self.display_result() - - if self.ns.verbose2 and self.bad: - self.rerun_failed_tests() - - self.finalize() + if self.want_wait: + input("Press any key to continue...") - self.save_xml_result() + setup_test_dir(self.test_dir) + selected, tests = self.find_tests(tests) + + exitcode = 0 + if self.want_list_tests: + self.list_tests(selected) + elif self.want_list_cases: + list_cases(selected, + match_tests=self.match_tests, + ignore_tests=self.ignore_tests, + test_dir=self.test_dir) + else: + exitcode = self.run_tests(selected, tests) - if self.bad: - sys.exit(EXITCODE_BAD_TEST) - if self.interrupted: - sys.exit(EXITCODE_INTERRUPTED) - if self.ns.fail_env_changed and self.environment_changed: - sys.exit(EXITCODE_ENV_CHANGED) - if self.no_tests_run(): - sys.exit(EXITCODE_NO_TESTS_RAN) - sys.exit(0) + sys.exit(exitcode) -def main(tests=None, **kwargs): +def main(tests=None, _add_python_opts=False, **kwargs): """Run the Python suite.""" - Regrtest().main(tests=tests, **kwargs) + ns = _parse_args(sys.argv[1:], **kwargs) + Regrtest(ns, _add_python_opts=_add_python_opts).main(tests=tests) diff --git a/Lib/test/libregrtest/pgo.py b/Lib/test/libregrtest/pgo.py index 42ce5fba7a97c3..e3a6927be5db1d 100644 --- a/Lib/test/libregrtest/pgo.py +++ b/Lib/test/libregrtest/pgo.py @@ -42,15 +42,15 @@ 'test_set', 'test_sqlite3', 
'test_statistics', + 'test_str', 'test_struct', 'test_tabnanny', 'test_time', - 'test_unicode', 'test_xml_etree', 'test_xml_etree_c', ] -def setup_pgo_tests(ns): - if not ns.args and not ns.pgo_extended: +def setup_pgo_tests(cmdline_args, pgo_extended: bool): + if not cmdline_args and not pgo_extended: # run default set of tests for PGO training - ns.args = PGO_TESTS[:] + cmdline_args[:] = PGO_TESTS[:] diff --git a/Lib/test/libregrtest/refleak.py b/Lib/test/libregrtest/refleak.py index 58a1419e0501df..59f48bd1e3cf78 100644 --- a/Lib/test/libregrtest/refleak.py +++ b/Lib/test/libregrtest/refleak.py @@ -1,10 +1,13 @@ -import os import sys import warnings from inspect import isabstract +from typing import Any + from test import support from test.support import os_helper -from test.libregrtest.utils import clear_caches + +from .runtests import HuntRefleak +from .utils import clear_caches try: from _abc import _get_dump @@ -19,7 +22,9 @@ def _get_dump(cls): cls._abc_negative_cache, cls._abc_negative_cache_version) -def dash_R(ns, test_name, test_func): +def runtest_refleak(test_name, test_func, + hunt_refleak: HuntRefleak, + quiet: bool): """Run a test multiple times, looking for reference leaks. Returns: @@ -41,6 +46,7 @@ def dash_R(ns, test_name, test_func): fs = warnings.filters[:] ps = copyreg.dispatch_table.copy() pic = sys.path_importer_cache.copy() + zdc: dict[str, Any] | None try: import zipimport except ImportError: @@ -62,9 +68,10 @@ def dash_R(ns, test_name, test_func): def get_pooled_int(value): return int_pool.setdefault(value, value) - nwarmup, ntracked, fname = ns.huntrleaks - fname = os.path.join(os_helper.SAVEDCWD, fname) - repcount = nwarmup + ntracked + warmups = hunt_refleak.warmups + runs = hunt_refleak.runs + filename = hunt_refleak.filename + repcount = warmups + runs # Pre-allocate to ensure that the loop doesn't allocate anything new rep_range = list(range(repcount)) @@ -73,12 +80,11 @@ def get_pooled_int(value): fd_deltas = [0] * repcount getallocatedblocks = sys.getallocatedblocks gettotalrefcount = sys.gettotalrefcount - _getquickenedcount = sys._getquickenedcount fd_count = os_helper.fd_count # initialize variables to make pyflakes quiet rc_before = alloc_before = fd_before = 0 - if not ns.quiet: + if not quiet: print("beginning", repcount, "repetitions", file=sys.stderr) print(("1234567890"*(repcount//10 + 1))[:repcount], file=sys.stderr, flush=True) @@ -93,12 +99,12 @@ def get_pooled_int(value): dash_R_cleanup(fs, ps, pic, zdc, abcs) support.gc_collect() - # Read memory statistics immediately after the garbage collection - alloc_after = getallocatedblocks() - _getquickenedcount() + # Read memory statistics immediately after the garbage collection. 
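The refleak changes above replace the old ns.huntrleaks triple with a HuntRefleak object carrying warmups, runs and a report filename: the test is repeated warmups + runs times, a delta (refcounts, memory blocks, file descriptors) is recorded per repetition, and only the post-warmup deltas are inspected. A simplified, self-contained model of that bookkeeping follows; the leak heuristic used here (every tracked run grew by at least one) is an assumption for illustration, not regrtest's exact checker:

    # Simplified model of the refleak bookkeeping shown above: drop the
    # warmup deltas, then apply a checker to the remaining runs.  The
    # checker below is an illustrative assumption, not regrtest's exact
    # heuristic, and HuntRefleakExample is a stand-in dataclass.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class HuntRefleakExample:
        warmups: int
        runs: int

    def looks_like_leak(deltas: list[int], hunt: HuntRefleakExample) -> bool:
        tracked = deltas[hunt.warmups:]          # ignore warmup repetitions
        return all(delta >= 1 for delta in tracked)

    if __name__ == "__main__":
        hunt = HuntRefleakExample(warmups=3, runs=3)      # "-R 3:3"-style settings
        print(looks_like_leak([5, 2, 0, 1, 1, 1], hunt))  # True: steady growth
        print(looks_like_leak([5, 2, 0, 0, 0, 0], hunt))  # False: warmup noise only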
+ alloc_after = getallocatedblocks() rc_after = gettotalrefcount() fd_after = fd_count() - if not ns.quiet: + if not quiet: print('.', end='', file=sys.stderr, flush=True) rc_deltas[i] = get_pooled_int(rc_after - rc_before) @@ -109,7 +115,7 @@ def get_pooled_int(value): rc_before = rc_after fd_before = fd_after - if not ns.quiet: + if not quiet: print(file=sys.stderr) # These checkers return False on success, True on failure @@ -138,12 +144,12 @@ def check_fd_deltas(deltas): (fd_deltas, 'file descriptors', check_fd_deltas) ]: # ignore warmup runs - deltas = deltas[nwarmup:] + deltas = deltas[warmups:] if checker(deltas): msg = '%s leaked %s %s, sum=%s' % ( test_name, deltas, item_name, sum(deltas)) print(msg, file=sys.stderr, flush=True) - with open(fname, "a", encoding="utf-8") as refrep: + with open(filename, "a", encoding="utf-8") as refrep: print(msg, file=refrep) refrep.flush() failed = True @@ -169,6 +175,7 @@ def dash_R_cleanup(fs, ps, pic, zdc, abcs): zipimport._zip_directory_cache.update(zdc) # Clear ABC registries, restoring previously saved ABC registries. + # ignore deprecation warning for collections.abc.ByteString abs_classes = [getattr(collections.abc, a) for a in collections.abc.__all__] abs_classes = filter(isabstract, abs_classes) for abc in abs_classes: diff --git a/Lib/test/libregrtest/result.py b/Lib/test/libregrtest/result.py new file mode 100644 index 00000000000000..d6b0d5ad383a5b --- /dev/null +++ b/Lib/test/libregrtest/result.py @@ -0,0 +1,190 @@ +import dataclasses +import json +from typing import Any + +from test.support import TestStats + +from .utils import ( + StrJSON, TestName, FilterTuple, + format_duration, normalize_test_name, print_warning) + + +# Avoid enum.Enum to reduce the number of imports when tests are run +class State: + PASSED = "PASSED" + FAILED = "FAILED" + SKIPPED = "SKIPPED" + UNCAUGHT_EXC = "UNCAUGHT_EXC" + REFLEAK = "REFLEAK" + ENV_CHANGED = "ENV_CHANGED" + RESOURCE_DENIED = "RESOURCE_DENIED" + INTERRUPTED = "INTERRUPTED" + WORKER_FAILED = "WORKER_FAILED" # non-zero worker process exit code + WORKER_BUG = "WORKER_BUG" # exception when running a worker + DID_NOT_RUN = "DID_NOT_RUN" + TIMEOUT = "TIMEOUT" + + @staticmethod + def is_failed(state): + return state in { + State.FAILED, + State.UNCAUGHT_EXC, + State.REFLEAK, + State.WORKER_FAILED, + State.WORKER_BUG, + State.TIMEOUT} + + @staticmethod + def has_meaningful_duration(state): + # Consider that the duration is meaningless for these cases. + # For example, if a whole test file is skipped, its duration + # is unlikely to be the duration of executing its tests, + # but just the duration to execute code which skips the test. 
+ return state not in { + State.SKIPPED, + State.RESOURCE_DENIED, + State.INTERRUPTED, + State.WORKER_FAILED, + State.WORKER_BUG, + State.DID_NOT_RUN} + + @staticmethod + def must_stop(state): + return state in { + State.INTERRUPTED, + State.WORKER_BUG, + } + + +@dataclasses.dataclass(slots=True) +class TestResult: + test_name: TestName + state: str | None = None + # Test duration in seconds + duration: float | None = None + xml_data: list[str] | None = None + stats: TestStats | None = None + + # errors and failures copied from support.TestFailedWithDetails + errors: list[tuple[str, str]] | None = None + failures: list[tuple[str, str]] | None = None + + def is_failed(self, fail_env_changed: bool) -> bool: + if self.state == State.ENV_CHANGED: + return fail_env_changed + return State.is_failed(self.state) + + def _format_failed(self): + if self.errors and self.failures: + le = len(self.errors) + lf = len(self.failures) + error_s = "error" + ("s" if le > 1 else "") + failure_s = "failure" + ("s" if lf > 1 else "") + return f"{self.test_name} failed ({le} {error_s}, {lf} {failure_s})" + + if self.errors: + le = len(self.errors) + error_s = "error" + ("s" if le > 1 else "") + return f"{self.test_name} failed ({le} {error_s})" + + if self.failures: + lf = len(self.failures) + failure_s = "failure" + ("s" if lf > 1 else "") + return f"{self.test_name} failed ({lf} {failure_s})" + + return f"{self.test_name} failed" + + def __str__(self) -> str: + match self.state: + case State.PASSED: + return f"{self.test_name} passed" + case State.FAILED: + return self._format_failed() + case State.SKIPPED: + return f"{self.test_name} skipped" + case State.UNCAUGHT_EXC: + return f"{self.test_name} failed (uncaught exception)" + case State.REFLEAK: + return f"{self.test_name} failed (reference leak)" + case State.ENV_CHANGED: + return f"{self.test_name} failed (env changed)" + case State.RESOURCE_DENIED: + return f"{self.test_name} skipped (resource denied)" + case State.INTERRUPTED: + return f"{self.test_name} interrupted" + case State.WORKER_FAILED: + return f"{self.test_name} worker non-zero exit code" + case State.WORKER_BUG: + return f"{self.test_name} worker bug" + case State.DID_NOT_RUN: + return f"{self.test_name} ran no tests" + case State.TIMEOUT: + return f"{self.test_name} timed out ({format_duration(self.duration)})" + case _: + raise ValueError("unknown result state: {state!r}") + + def has_meaningful_duration(self): + return State.has_meaningful_duration(self.state) + + def set_env_changed(self): + if self.state is None or self.state == State.PASSED: + self.state = State.ENV_CHANGED + + def must_stop(self, fail_fast: bool, fail_env_changed: bool) -> bool: + if State.must_stop(self.state): + return True + if fail_fast and self.is_failed(fail_env_changed): + return True + return False + + def get_rerun_match_tests(self) -> FilterTuple | None: + match_tests = [] + + errors = self.errors or [] + failures = self.failures or [] + for error_list, is_error in ( + (errors, True), + (failures, False), + ): + for full_name, *_ in error_list: + match_name = normalize_test_name(full_name, is_error=is_error) + if match_name is None: + # 'setUpModule (test.test_sys)': don't filter tests + return None + if not match_name: + error_type = "ERROR" if is_error else "FAIL" + print_warning(f"rerun failed to parse {error_type} test name: " + f"{full_name!r}: don't filter tests") + return None + match_tests.append(match_name) + + if not match_tests: + return None + return tuple(match_tests) + + def write_json_into(self, 
file) -> None: + json.dump(self, file, cls=_EncodeTestResult) + + @staticmethod + def from_json(worker_json: StrJSON) -> 'TestResult': + return json.loads(worker_json, object_hook=_decode_test_result) + + +class _EncodeTestResult(json.JSONEncoder): + def default(self, o: Any) -> dict[str, Any]: + if isinstance(o, TestResult): + result = dataclasses.asdict(o) + result["__test_result__"] = o.__class__.__name__ + return result + else: + return super().default(o) + + +def _decode_test_result(data: dict[str, Any]) -> TestResult | dict[str, Any]: + if "__test_result__" in data: + data.pop('__test_result__') + if data['stats'] is not None: + data['stats'] = TestStats(**data['stats']) + return TestResult(**data) + else: + return data diff --git a/Lib/test/libregrtest/results.py b/Lib/test/libregrtest/results.py new file mode 100644 index 00000000000000..3708078ff0bf3a --- /dev/null +++ b/Lib/test/libregrtest/results.py @@ -0,0 +1,261 @@ +import sys +from test.support import TestStats + +from .runtests import RunTests +from .result import State, TestResult +from .utils import ( + StrPath, TestName, TestTuple, TestList, FilterDict, + printlist, count, format_duration) + + +# Python uses exit code 1 when an exception is not catched +# argparse.ArgumentParser.error() uses exit code 2 +EXITCODE_BAD_TEST = 2 +EXITCODE_ENV_CHANGED = 3 +EXITCODE_NO_TESTS_RAN = 4 +EXITCODE_RERUN_FAIL = 5 +EXITCODE_INTERRUPTED = 130 # 128 + signal.SIGINT=2 + + +class TestResults: + def __init__(self): + self.bad: TestList = [] + self.good: TestList = [] + self.rerun_bad: TestList = [] + self.skipped: TestList = [] + self.resource_denied: TestList = [] + self.env_changed: TestList = [] + self.run_no_tests: TestList = [] + self.rerun: TestList = [] + self.rerun_results: list[TestResult] = [] + + self.interrupted: bool = False + self.worker_bug: bool = False + self.test_times: list[tuple[float, TestName]] = [] + self.stats = TestStats() + # used by --junit-xml + self.testsuite_xml: list[str] = [] + + def is_all_good(self): + return (not self.bad + and not self.skipped + and not self.interrupted + and not self.worker_bug) + + def get_executed(self): + return (set(self.good) | set(self.bad) | set(self.skipped) + | set(self.resource_denied) | set(self.env_changed) + | set(self.run_no_tests)) + + def no_tests_run(self): + return not any((self.good, self.bad, self.skipped, self.interrupted, + self.env_changed)) + + def get_state(self, fail_env_changed): + state = [] + if self.bad: + state.append("FAILURE") + elif fail_env_changed and self.env_changed: + state.append("ENV CHANGED") + elif self.no_tests_run(): + state.append("NO TESTS RAN") + + if self.interrupted: + state.append("INTERRUPTED") + if self.worker_bug: + state.append("WORKER BUG") + if not state: + state.append("SUCCESS") + + return ', '.join(state) + + def get_exitcode(self, fail_env_changed, fail_rerun): + exitcode = 0 + if self.bad: + exitcode = EXITCODE_BAD_TEST + elif self.interrupted: + exitcode = EXITCODE_INTERRUPTED + elif fail_env_changed and self.env_changed: + exitcode = EXITCODE_ENV_CHANGED + elif self.no_tests_run(): + exitcode = EXITCODE_NO_TESTS_RAN + elif fail_rerun and self.rerun: + exitcode = EXITCODE_RERUN_FAIL + elif self.worker_bug: + exitcode = EXITCODE_BAD_TEST + return exitcode + + def accumulate_result(self, result: TestResult, runtests: RunTests): + test_name = result.test_name + rerun = runtests.rerun + fail_env_changed = runtests.fail_env_changed + + match result.state: + case State.PASSED: + self.good.append(test_name) + case 
State.ENV_CHANGED: + self.env_changed.append(test_name) + self.rerun_results.append(result) + case State.SKIPPED: + self.skipped.append(test_name) + case State.RESOURCE_DENIED: + self.resource_denied.append(test_name) + case State.INTERRUPTED: + self.interrupted = True + case State.DID_NOT_RUN: + self.run_no_tests.append(test_name) + case _: + if result.is_failed(fail_env_changed): + self.bad.append(test_name) + self.rerun_results.append(result) + else: + raise ValueError(f"invalid test state: {result.state!r}") + + if result.state == State.WORKER_BUG: + self.worker_bug = True + + if result.has_meaningful_duration() and not rerun: + self.test_times.append((result.duration, test_name)) + if result.stats is not None: + self.stats.accumulate(result.stats) + if rerun: + self.rerun.append(test_name) + + xml_data = result.xml_data + if xml_data: + self.add_junit(xml_data) + + def need_rerun(self): + return bool(self.rerun_results) + + def prepare_rerun(self) -> tuple[TestTuple, FilterDict]: + tests: TestList = [] + match_tests_dict = {} + for result in self.rerun_results: + tests.append(result.test_name) + + match_tests = result.get_rerun_match_tests() + # ignore empty match list + if match_tests: + match_tests_dict[result.test_name] = match_tests + + # Clear previously failed tests + self.rerun_bad.extend(self.bad) + self.bad.clear() + self.env_changed.clear() + self.rerun_results.clear() + + return (tuple(tests), match_tests_dict) + + def add_junit(self, xml_data: list[str]): + import xml.etree.ElementTree as ET + for e in xml_data: + try: + self.testsuite_xml.append(ET.fromstring(e)) + except ET.ParseError: + print(xml_data, file=sys.__stderr__) + raise + + def write_junit(self, filename: StrPath): + if not self.testsuite_xml: + # Don't create empty XML file + return + + import xml.etree.ElementTree as ET + root = ET.Element("testsuites") + + # Manually count the totals for the overall summary + totals = {'tests': 0, 'errors': 0, 'failures': 0} + for suite in self.testsuite_xml: + root.append(suite) + for k in totals: + try: + totals[k] += int(suite.get(k, 0)) + except ValueError: + pass + + for k, v in totals.items(): + root.set(k, str(v)) + + with open(filename, 'wb') as f: + for s in ET.tostringlist(root): + f.write(s) + + def display_result(self, tests: TestTuple, quiet: bool, print_slowest: bool): + if print_slowest: + self.test_times.sort(reverse=True) + print() + print("10 slowest tests:") + for test_time, test in self.test_times[:10]: + print("- %s: %s" % (test, format_duration(test_time))) + + all_tests = [] + omitted = set(tests) - self.get_executed() + + # less important + all_tests.append((omitted, "test", "{} omitted:")) + if not quiet: + all_tests.append((self.skipped, "test", "{} skipped:")) + all_tests.append((self.resource_denied, "test", "{} skipped (resource denied):")) + all_tests.append((self.run_no_tests, "test", "{} run no tests:")) + + # more important + all_tests.append((self.env_changed, "test", "{} altered the execution environment (env changed):")) + all_tests.append((self.rerun, "re-run test", "{}:")) + all_tests.append((self.bad, "test", "{} failed:")) + + for tests_list, count_text, title_format in all_tests: + if tests_list: + print() + count_text = count(len(tests_list), count_text) + print(title_format.format(count_text)) + printlist(tests_list) + + if self.good and not quiet: + print() + text = count(len(self.good), "test") + text = f"{text} OK." 
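The new result.py shown earlier in this patch moves worker results over JSON by tagging the dataclasses.asdict() output with a "__test_result__" marker and rebuilding the object through an object_hook. Here is a self-contained sketch of that round-trip pattern, using a small stand-in dataclass instead of the real TestResult/TestStats classes:

    # Sketch of the JSON round-trip pattern used by result.py above,
    # with a stand-in Outcome dataclass (an assumption of this sketch,
    # not the real TestResult class).
    import dataclasses
    import json
    from typing import Any

    @dataclasses.dataclass
    class Outcome:
        test_name: str
        state: str | None = None
        duration: float | None = None

    class _EncodeOutcome(json.JSONEncoder):
        def default(self, o: Any) -> dict[str, Any]:
            if isinstance(o, Outcome):
                data = dataclasses.asdict(o)
                data["__outcome__"] = True      # marker used by the decoder
                return data
            return super().default(o)

    def _decode_outcome(data: dict[str, Any]) -> Any:
        if data.pop("__outcome__", False):
            return Outcome(**data)
        return data

    if __name__ == "__main__":
        sent = Outcome("test_os", state="PASSED", duration=1.5)
        payload = json.dumps(sent, cls=_EncodeOutcome)           # worker -> pipe/stdout
        received = json.loads(payload, object_hook=_decode_outcome)
        assert received == sent
        print(received)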
+ if (self.is_all_good() and len(self.good) > 1): + text = f"All {text}" + print(text) + + if self.interrupted: + print() + print("Test suite interrupted by signal SIGINT.") + + def display_summary(self, first_runtests: RunTests, filtered: bool): + # Total tests + stats = self.stats + text = f'run={stats.tests_run:,}' + if filtered: + text = f"{text} (filtered)" + report = [text] + if stats.failures: + report.append(f'failures={stats.failures:,}') + if stats.skipped: + report.append(f'skipped={stats.skipped:,}') + print(f"Total tests: {' '.join(report)}") + + # Total test files + all_tests = [self.good, self.bad, self.rerun, + self.skipped, + self.env_changed, self.run_no_tests] + run = sum(map(len, all_tests)) + text = f'run={run}' + if not first_runtests.forever: + ntest = len(first_runtests.tests) + text = f"{text}/{ntest}" + if filtered: + text = f"{text} (filtered)" + report = [text] + for name, tests in ( + ('failed', self.bad), + ('env_changed', self.env_changed), + ('skipped', self.skipped), + ('resource_denied', self.resource_denied), + ('rerun', self.rerun), + ('run_no_tests', self.run_no_tests), + ): + if tests: + report.append(f'{name}={len(tests)}') + print(f"Total test files: {' '.join(report)}") diff --git a/Lib/test/libregrtest/run_workers.py b/Lib/test/libregrtest/run_workers.py new file mode 100644 index 00000000000000..16f8331abd32f9 --- /dev/null +++ b/Lib/test/libregrtest/run_workers.py @@ -0,0 +1,607 @@ +import contextlib +import dataclasses +import faulthandler +import os.path +import queue +import signal +import subprocess +import sys +import tempfile +import threading +import time +import traceback +from typing import Literal, TextIO + +from test import support +from test.support import os_helper, MS_WINDOWS + +from .logger import Logger +from .result import TestResult, State +from .results import TestResults +from .runtests import RunTests, JsonFile, JsonFileType +from .single import PROGRESS_MIN_TIME +from .utils import ( + StrPath, TestName, + format_duration, print_warning, count, plural, get_signal_name) +from .worker import create_worker_process, USE_PROCESS_GROUP + +if MS_WINDOWS: + import locale + import msvcrt + + + +# Display the running tests if nothing happened last N seconds +PROGRESS_UPDATE = 30.0 # seconds +assert PROGRESS_UPDATE >= PROGRESS_MIN_TIME + +# Kill the main process after 5 minutes. It is supposed to write an update +# every PROGRESS_UPDATE seconds. Tolerate 5 minutes for Python slowest +# buildbot workers. +MAIN_PROCESS_TIMEOUT = 5 * 60.0 +assert MAIN_PROCESS_TIMEOUT >= PROGRESS_UPDATE + +# Time to wait until a worker completes: should be immediate +WAIT_COMPLETED_TIMEOUT = 30.0 # seconds + +# Time to wait a killed process (in seconds) +WAIT_KILLED_TIMEOUT = 60.0 + + +# We do not use a generator so multiple threads can call next(). 
+class MultiprocessIterator: + + """A thread-safe iterator over tests for multiprocess mode.""" + + def __init__(self, tests_iter): + self.lock = threading.Lock() + self.tests_iter = tests_iter + + def __iter__(self): + return self + + def __next__(self): + with self.lock: + if self.tests_iter is None: + raise StopIteration + return next(self.tests_iter) + + def stop(self): + with self.lock: + self.tests_iter = None + + +@dataclasses.dataclass(slots=True, frozen=True) +class MultiprocessResult: + result: TestResult + # bpo-45410: stderr is written into stdout to keep messages order + worker_stdout: str | None = None + err_msg: str | None = None + + +ExcStr = str +QueueOutput = tuple[Literal[False], MultiprocessResult] | tuple[Literal[True], ExcStr] + + +class ExitThread(Exception): + pass + + +class WorkerError(Exception): + def __init__(self, + test_name: TestName, + err_msg: str | None, + stdout: str | None, + state: str): + result = TestResult(test_name, state=state) + self.mp_result = MultiprocessResult(result, stdout, err_msg) + super().__init__() + + +class WorkerThread(threading.Thread): + def __init__(self, worker_id: int, runner: "RunWorkers") -> None: + super().__init__() + self.worker_id = worker_id + self.runtests = runner.runtests + self.pending = runner.pending + self.output = runner.output + self.timeout = runner.worker_timeout + self.log = runner.log + self.test_name: TestName | None = None + self.start_time: float | None = None + self._popen: subprocess.Popen[str] | None = None + self._killed = False + self._stopped = False + + def __repr__(self) -> str: + info = [f'WorkerThread #{self.worker_id}'] + if self.is_alive(): + info.append("running") + else: + info.append('stopped') + test = self.test_name + if test: + info.append(f'test={test}') + popen = self._popen + if popen is not None: + dt = time.monotonic() - self.start_time + info.extend((f'pid={self._popen.pid}', + f'time={format_duration(dt)}')) + return '<%s>' % ' '.join(info) + + def _kill(self) -> None: + popen = self._popen + if popen is None: + return + + if self._killed: + return + self._killed = True + + if USE_PROCESS_GROUP: + what = f"{self} process group" + else: + what = f"{self} process" + + print(f"Kill {what}", file=sys.stderr, flush=True) + try: + if USE_PROCESS_GROUP: + os.killpg(popen.pid, signal.SIGKILL) + else: + popen.kill() + except ProcessLookupError: + # popen.kill(): the process completed, the WorkerThread thread + # read its exit status, but Popen.send_signal() read the returncode + # just before Popen.wait() set returncode. + pass + except OSError as exc: + print_warning(f"Failed to kill {what}: {exc!r}") + + def stop(self) -> None: + # Method called from a different thread to stop this thread + self._stopped = True + self._kill() + + def _run_process(self, runtests: RunTests, output_fd: int, + tmp_dir: StrPath | None = None) -> int | None: + popen = create_worker_process(runtests, output_fd, tmp_dir) + self._popen = popen + self._killed = False + + try: + if self._stopped: + # If kill() has been called before self._popen is set, + # self._popen is still running. Call again kill() + # to ensure that the process is killed. 
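MultiprocessIterator above is deliberately a locked class rather than a generator, because several WorkerThread instances advance the same iterator concurrently and plain generators cannot be advanced from multiple threads at once. A standalone usage sketch of the same pattern follows (the class is duplicated under a different name only so the example runs on its own):

    # Usage sketch of the MultiprocessIterator pattern defined above:
    # several threads pull names from one shared, lock-protected iterator.
    # SharedIterator is a local copy so the sketch is self-contained.
    import threading

    class SharedIterator:
        def __init__(self, items_iter):
            self.lock = threading.Lock()
            self.items_iter = items_iter

        def __iter__(self):
            return self

        def __next__(self):
            with self.lock:
                if self.items_iter is None:
                    raise StopIteration
                return next(self.items_iter)

        def stop(self):
            with self.lock:
                self.items_iter = None

    if __name__ == "__main__":
        pending = SharedIterator(iter([f"test_{i}" for i in range(10)]))
        seen: list[str] = []
        seen_lock = threading.Lock()

        def worker():
            for name in pending:          # each name is taken by exactly one thread
                with seen_lock:
                    seen.append(name)

        threads = [threading.Thread(target=worker) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print(sorted(seen))               # all 10 names, no duplicates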
+ self._kill() + raise ExitThread + + try: + # gh-94026: stdout+stderr are written to tempfile + retcode = popen.wait(timeout=self.timeout) + assert retcode is not None + return retcode + except subprocess.TimeoutExpired: + if self._stopped: + # kill() has been called: communicate() fails on reading + # closed stdout + raise ExitThread + + # On timeout, kill the process + self._kill() + + # None means TIMEOUT for the caller + retcode = None + # bpo-38207: Don't attempt to call communicate() again: on it + # can hang until all child processes using stdout + # pipes completes. + except OSError: + if self._stopped: + # kill() has been called: communicate() fails + # on reading closed stdout + raise ExitThread + raise + except: + self._kill() + raise + finally: + self._wait_completed() + self._popen = None + + def create_stdout(self, stack: contextlib.ExitStack) -> TextIO: + """Create stdout temporay file (file descriptor).""" + + if MS_WINDOWS: + # gh-95027: When stdout is not a TTY, Python uses the ANSI code + # page for the sys.stdout encoding. If the main process runs in a + # terminal, sys.stdout uses WindowsConsoleIO with UTF-8 encoding. + encoding = locale.getencoding() + else: + encoding = sys.stdout.encoding + + # gh-94026: Write stdout+stderr to a tempfile as workaround for + # non-blocking pipes on Emscripten with NodeJS. + # gh-109425: Use "backslashreplace" error handler: log corrupted + # stdout+stderr, instead of failing with a UnicodeDecodeError and not + # logging stdout+stderr at all. + stdout_file = tempfile.TemporaryFile('w+', + encoding=encoding, + errors='backslashreplace') + stack.enter_context(stdout_file) + return stdout_file + + def create_json_file(self, stack: contextlib.ExitStack) -> tuple[JsonFile, TextIO | None]: + """Create JSON file.""" + + json_file_use_stdout = self.runtests.json_file_use_stdout() + if json_file_use_stdout: + json_file = JsonFile(None, JsonFileType.STDOUT) + json_tmpfile = None + else: + json_tmpfile = tempfile.TemporaryFile('w+', encoding='utf8') + stack.enter_context(json_tmpfile) + + json_fd = json_tmpfile.fileno() + if MS_WINDOWS: + json_handle = msvcrt.get_osfhandle(json_fd) + json_file = JsonFile(json_handle, + JsonFileType.WINDOWS_HANDLE) + else: + json_file = JsonFile(json_fd, JsonFileType.UNIX_FD) + return (json_file, json_tmpfile) + + def create_worker_runtests(self, test_name: TestName, json_file: JsonFile) -> RunTests: + """Create the worker RunTests.""" + + tests = (test_name,) + if self.runtests.rerun: + match_tests = self.runtests.get_match_tests(test_name) + else: + match_tests = None + + kwargs = {} + if match_tests: + kwargs['match_tests'] = match_tests + if self.runtests.output_on_failure: + kwargs['verbose'] = True + kwargs['output_on_failure'] = False + return self.runtests.copy( + tests=tests, + json_file=json_file, + **kwargs) + + def run_tmp_files(self, worker_runtests: RunTests, + stdout_fd: int) -> tuple[int | None, list[StrPath]]: + # gh-93353: Check for leaked temporary files in the parent process, + # since the deletion of temporary files can happen late during + # Python finalization: too late for libregrtest. + if not support.is_wasi: + # Don't check for leaked temporary files and directories if Python is + # run on WASI. WASI don't pass environment variables like TMPDIR to + # worker processes. 
+ tmp_dir = tempfile.mkdtemp(prefix="test_python_") + tmp_dir = os.path.abspath(tmp_dir) + try: + retcode = self._run_process(worker_runtests, + stdout_fd, tmp_dir) + finally: + tmp_files = os.listdir(tmp_dir) + os_helper.rmtree(tmp_dir) + else: + retcode = self._run_process(worker_runtests, stdout_fd) + tmp_files = [] + + return (retcode, tmp_files) + + def read_stdout(self, stdout_file: TextIO) -> str: + stdout_file.seek(0) + try: + return stdout_file.read().strip() + except Exception as exc: + # gh-101634: Catch UnicodeDecodeError if stdout cannot be + # decoded from encoding + raise WorkerError(self.test_name, + f"Cannot read process stdout: {exc}", + stdout=None, + state=State.WORKER_BUG) + + def read_json(self, json_file: JsonFile, json_tmpfile: TextIO | None, + stdout: str) -> tuple[TestResult, str]: + try: + if json_tmpfile is not None: + json_tmpfile.seek(0) + worker_json = json_tmpfile.read() + elif json_file.file_type == JsonFileType.STDOUT: + stdout, _, worker_json = stdout.rpartition("\n") + stdout = stdout.rstrip() + else: + with json_file.open(encoding='utf8') as json_fp: + worker_json = json_fp.read() + except Exception as exc: + # gh-101634: Catch UnicodeDecodeError if stdout cannot be + # decoded from encoding + err_msg = f"Failed to read worker process JSON: {exc}" + raise WorkerError(self.test_name, err_msg, stdout, + state=State.WORKER_BUG) + + if not worker_json: + raise WorkerError(self.test_name, "empty JSON", stdout, + state=State.WORKER_BUG) + + try: + result = TestResult.from_json(worker_json) + except Exception as exc: + # gh-101634: Catch UnicodeDecodeError if stdout cannot be + # decoded from encoding + err_msg = f"Failed to parse worker process JSON: {exc}" + raise WorkerError(self.test_name, err_msg, stdout, + state=State.WORKER_BUG) + + return (result, stdout) + + def _runtest(self, test_name: TestName) -> MultiprocessResult: + with contextlib.ExitStack() as stack: + stdout_file = self.create_stdout(stack) + json_file, json_tmpfile = self.create_json_file(stack) + worker_runtests = self.create_worker_runtests(test_name, json_file) + + retcode, tmp_files = self.run_tmp_files(worker_runtests, + stdout_file.fileno()) + + stdout = self.read_stdout(stdout_file) + + if retcode is None: + raise WorkerError(self.test_name, stdout=stdout, + err_msg=None, + state=State.TIMEOUT) + if retcode != 0: + name = get_signal_name(retcode) + if name: + retcode = f"{retcode} ({name})" + raise WorkerError(self.test_name, f"Exit code {retcode}", stdout, + state=State.WORKER_FAILED) + + result, stdout = self.read_json(json_file, json_tmpfile, stdout) + + if tmp_files: + msg = (f'\n\n' + f'Warning -- {test_name} leaked temporary files ' + f'({len(tmp_files)}): {", ".join(sorted(tmp_files))}') + stdout += msg + result.set_env_changed() + + return MultiprocessResult(result, stdout) + + def run(self) -> None: + fail_fast = self.runtests.fail_fast + fail_env_changed = self.runtests.fail_env_changed + while not self._stopped: + try: + try: + test_name = next(self.pending) + except StopIteration: + break + + self.start_time = time.monotonic() + self.test_name = test_name + try: + mp_result = self._runtest(test_name) + except WorkerError as exc: + mp_result = exc.mp_result + finally: + self.test_name = None + mp_result.result.duration = time.monotonic() - self.start_time + self.output.put((False, mp_result)) + + if mp_result.result.must_stop(fail_fast, fail_env_changed): + break + except ExitThread: + break + except BaseException: + self.output.put((True, traceback.format_exc())) + break 
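WorkerThread._runtest() above folds three failure channels into one result: a missing exit code means the worker hit its timeout, a non-zero exit code is a worker failure, and otherwise the JSON payload is peeled off the tail of the captured stdout (the JsonFileType.STDOUT mode). Below is a condensed sketch of that decision, reusing the State names introduced earlier in the patch; it is a simplification, not the method itself:

    # Condensed model of how _runtest() above maps a worker's exit status
    # and captured stdout to a result state.  The state strings match the
    # State class introduced earlier; the helper itself is a sketch.
    import json

    def interpret_worker_output(retcode: int | None,
                                stdout: str) -> tuple[str, dict | None]:
        if retcode is None:
            return ("TIMEOUT", None)            # worker killed after timeout
        if retcode != 0:
            return ("WORKER_FAILED", None)      # non-zero worker exit code
        # In STDOUT mode, the JSON result is the last line of stdout.
        _, _, worker_json = stdout.rpartition("\n")
        if not worker_json:
            return ("WORKER_BUG", None)         # empty JSON payload
        result = json.loads(worker_json)
        return (result.get("state", "PASSED"), result)

    if __name__ == "__main__":
        ok = 'test_os passed\n{"test_name": "test_os", "state": "PASSED"}'
        print(interpret_worker_output(0, ok))
        print(interpret_worker_output(None, ""))   # worker hit the timeout
        print(interpret_worker_output(1, ""))      # worker crashed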
+
+    def _wait_completed(self) -> None:
+        popen = self._popen
+
+        try:
+            popen.wait(WAIT_COMPLETED_TIMEOUT)
+        except (subprocess.TimeoutExpired, OSError) as exc:
+            print_warning(f"Failed to wait for {self} completion "
+                          f"(timeout={format_duration(WAIT_COMPLETED_TIMEOUT)}): "
+                          f"{exc!r}")
+
+    def wait_stopped(self, start_time: float) -> None:
+        # bpo-38207: RunWorkers.stop_workers() called self.stop()
+        # which killed the process. Sometimes, killing the process from the
+        # main thread does not interrupt popen.communicate() in
+        # WorkerThread thread. This loop with a timeout is a workaround
+        # for that.
+        #
+        # Moreover, if this method fails to join the thread, it is likely
+        # that Python will hang at exit while calling threading._shutdown()
+        # which tries again to join the blocked thread. Regrtest.main()
+        # uses EXIT_TIMEOUT to work around this second bug.
+        while True:
+            # Write a message every second
+            self.join(1.0)
+            if not self.is_alive():
+                break
+            dt = time.monotonic() - start_time
+            self.log(f"Waiting for {self} thread for {format_duration(dt)}")
+            if dt > WAIT_KILLED_TIMEOUT:
+                print_warning(f"Failed to join {self} in {format_duration(dt)}")
+                break
+
+
+def get_running(workers: list[WorkerThread]) -> str | None:
+    running: list[str] = []
+    for worker in workers:
+        test_name = worker.test_name
+        if not test_name:
+            continue
+        dt = time.monotonic() - worker.start_time
+        if dt >= PROGRESS_MIN_TIME:
+            text = f'{test_name} ({format_duration(dt)})'
+            running.append(text)
+    if not running:
+        return None
+    return f"running ({len(running)}): {', '.join(running)}"
+
+
+class RunWorkers:
+    def __init__(self, num_workers: int, runtests: RunTests,
+                 logger: Logger, results: TestResults) -> None:
+        self.num_workers = num_workers
+        self.runtests = runtests
+        self.log = logger.log
+        self.display_progress = logger.display_progress
+        self.results: TestResults = results
+
+        self.output: queue.Queue[QueueOutput] = queue.Queue()
+        tests_iter = runtests.iter_tests()
+        self.pending = MultiprocessIterator(tests_iter)
+        self.timeout = runtests.timeout
+        if self.timeout is not None:
+            # Rely on faulthandler to kill a worker process. This timeout is
+            # used when faulthandler fails to kill a worker process. Give a
+            # maximum of 5 minutes to faulthandler to kill the worker.
+            self.worker_timeout: float | None = min(self.timeout * 1.5, self.timeout + 5 * 60)
+        else:
+            self.worker_timeout = None
+        self.workers: list[WorkerThread] | None = None
+
+        jobs = self.runtests.get_jobs()
+        if jobs is not None:
+            # Don't spawn more threads than the number of jobs:
+            # these worker threads would never get anything to do.
+ self.num_workers = min(self.num_workers, jobs) + + def start_workers(self) -> None: + self.workers = [WorkerThread(index, self) + for index in range(1, self.num_workers + 1)] + jobs = self.runtests.get_jobs() + if jobs is not None: + tests = count(jobs, 'test') + else: + tests = 'tests' + nworkers = len(self.workers) + processes = plural(nworkers, "process", "processes") + msg = (f"Run {tests} in parallel using " + f"{nworkers} worker {processes}") + if self.timeout: + msg += (" (timeout: %s, worker timeout: %s)" + % (format_duration(self.timeout), + format_duration(self.worker_timeout))) + self.log(msg) + for worker in self.workers: + worker.start() + + def stop_workers(self) -> None: + start_time = time.monotonic() + for worker in self.workers: + worker.stop() + for worker in self.workers: + worker.wait_stopped(start_time) + + def _get_result(self) -> QueueOutput | None: + pgo = self.runtests.pgo + use_faulthandler = (self.timeout is not None) + + # bpo-46205: check the status of workers every iteration to avoid + # waiting forever on an empty queue. + while any(worker.is_alive() for worker in self.workers): + if use_faulthandler: + faulthandler.dump_traceback_later(MAIN_PROCESS_TIMEOUT, + exit=True) + + # wait for a thread + try: + return self.output.get(timeout=PROGRESS_UPDATE) + except queue.Empty: + pass + + if not pgo: + # display progress + running = get_running(self.workers) + if running: + self.log(running) + + # all worker threads are done: consume pending results + try: + return self.output.get(timeout=0) + except queue.Empty: + return None + + def display_result(self, mp_result: MultiprocessResult) -> None: + result = mp_result.result + pgo = self.runtests.pgo + + text = str(result) + if mp_result.err_msg: + # WORKER_BUG + text += ' (%s)' % mp_result.err_msg + elif (result.duration >= PROGRESS_MIN_TIME and not pgo): + text += ' (%s)' % format_duration(result.duration) + if not pgo: + running = get_running(self.workers) + if running: + text += f' -- {running}' + self.display_progress(self.test_index, text) + + def _process_result(self, item: QueueOutput) -> TestResult: + """Returns True if test runner must stop.""" + if item[0]: + # Thread got an exception + format_exc = item[1] + print_warning(f"regrtest worker thread failed: {format_exc}") + result = TestResult("", state=State.WORKER_BUG) + self.results.accumulate_result(result, self.runtests) + return result + + self.test_index += 1 + mp_result = item[1] + result = mp_result.result + self.results.accumulate_result(result, self.runtests) + self.display_result(mp_result) + + # Display worker stdout + if not self.runtests.output_on_failure: + show_stdout = True + else: + # --verbose3 ignores stdout on success + show_stdout = (result.state != State.PASSED) + if show_stdout: + stdout = mp_result.worker_stdout + if stdout: + print(stdout, flush=True) + + return result + + def run(self) -> None: + fail_fast = self.runtests.fail_fast + fail_env_changed = self.runtests.fail_env_changed + + self.start_workers() + + self.test_index = 0 + try: + while True: + item = self._get_result() + if item is None: + break + + result = self._process_result(item) + if result.must_stop(fail_fast, fail_env_changed): + break + except KeyboardInterrupt: + print() + self.results.interrupted = True + finally: + if self.timeout is not None: + faulthandler.cancel_dump_traceback_later() + + # Always ensure that all worker processes are no longer + # worker when we exit this function + self.pending.stop() + self.stop_workers() diff --git 
a/Lib/test/libregrtest/runtest.py b/Lib/test/libregrtest/runtest.py deleted file mode 100644 index f37093b81cf284..00000000000000 --- a/Lib/test/libregrtest/runtest.py +++ /dev/null @@ -1,479 +0,0 @@ -import dataclasses -import doctest -import faulthandler -import functools -import gc -import importlib -import io -import os -import sys -import time -import traceback -import unittest - -from test import support -from test.support import TestStats -from test.support import os_helper -from test.support import threading_helper -from test.libregrtest.cmdline import Namespace -from test.libregrtest.save_env import saved_test_environment -from test.libregrtest.utils import clear_caches, format_duration, print_warning - - -# Avoid enum.Enum to reduce the number of imports when tests are run -class State: - PASSED = "PASSED" - FAILED = "FAILED" - SKIPPED = "SKIPPED" - UNCAUGHT_EXC = "UNCAUGHT_EXC" - REFLEAK = "REFLEAK" - ENV_CHANGED = "ENV_CHANGED" - RESOURCE_DENIED = "RESOURCE_DENIED" - INTERRUPTED = "INTERRUPTED" - MULTIPROCESSING_ERROR = "MULTIPROCESSING_ERROR" - DID_NOT_RUN = "DID_NOT_RUN" - TIMEOUT = "TIMEOUT" - - @staticmethod - def is_failed(state): - return state in { - State.FAILED, - State.UNCAUGHT_EXC, - State.REFLEAK, - State.MULTIPROCESSING_ERROR, - State.TIMEOUT} - - @staticmethod - def has_meaningful_duration(state): - # Consider that the duration is meaningless for these cases. - # For example, if a whole test file is skipped, its duration - # is unlikely to be the duration of executing its tests, - # but just the duration to execute code which skips the test. - return state not in { - State.SKIPPED, - State.RESOURCE_DENIED, - State.INTERRUPTED, - State.MULTIPROCESSING_ERROR, - State.DID_NOT_RUN} - - -@dataclasses.dataclass(slots=True) -class TestResult: - test_name: str - state: str | None = None - # Test duration in seconds - duration: float | None = None - xml_data: list[str] | None = None - stats: TestStats | None = None - - # errors and failures copied from support.TestFailedWithDetails - errors: list[tuple[str, str]] | None = None - failures: list[tuple[str, str]] | None = None - - def is_failed(self, fail_env_changed: bool) -> bool: - if self.state == State.ENV_CHANGED: - return fail_env_changed - return State.is_failed(self.state) - - def _format_failed(self): - if self.errors and self.failures: - le = len(self.errors) - lf = len(self.failures) - error_s = "error" + ("s" if le > 1 else "") - failure_s = "failure" + ("s" if lf > 1 else "") - return f"{self.test_name} failed ({le} {error_s}, {lf} {failure_s})" - - if self.errors: - le = len(self.errors) - error_s = "error" + ("s" if le > 1 else "") - return f"{self.test_name} failed ({le} {error_s})" - - if self.failures: - lf = len(self.failures) - failure_s = "failure" + ("s" if lf > 1 else "") - return f"{self.test_name} failed ({lf} {failure_s})" - - return f"{self.test_name} failed" - - def __str__(self) -> str: - match self.state: - case State.PASSED: - return f"{self.test_name} passed" - case State.FAILED: - return self._format_failed() - case State.SKIPPED: - return f"{self.test_name} skipped" - case State.UNCAUGHT_EXC: - return f"{self.test_name} failed (uncaught exception)" - case State.REFLEAK: - return f"{self.test_name} failed (reference leak)" - case State.ENV_CHANGED: - return f"{self.test_name} failed (env changed)" - case State.RESOURCE_DENIED: - return f"{self.test_name} skipped (resource denied)" - case State.INTERRUPTED: - return f"{self.test_name} interrupted" - case State.MULTIPROCESSING_ERROR: - return 
f"{self.test_name} process crashed" - case State.DID_NOT_RUN: - return f"{self.test_name} ran no tests" - case State.TIMEOUT: - return f"{self.test_name} timed out ({format_duration(self.duration)})" - case _: - raise ValueError("unknown result state: {state!r}") - - def has_meaningful_duration(self): - return State.has_meaningful_duration(self.state) - - def set_env_changed(self): - if self.state is None or self.state == State.PASSED: - self.state = State.ENV_CHANGED - - -# Minimum duration of a test to display its duration or to mention that -# the test is running in background -PROGRESS_MIN_TIME = 30.0 # seconds - -#If these test directories are encountered recurse into them and treat each -# test_ .py or dir as a separate test module. This can increase parallelism. -# Beware this can't generally be done for any directory with sub-tests as the -# __init__.py may do things which alter what tests are to be run. - -SPLITTESTDIRS = { - "test_asyncio", - "test_concurrent_futures", - "test_future_stmt", - "test_gdb", - "test_multiprocessing_fork", - "test_multiprocessing_forkserver", - "test_multiprocessing_spawn", -} - -# Storage of uncollectable objects -FOUND_GARBAGE = [] - - -def findtestdir(path=None): - return path or os.path.dirname(os.path.dirname(__file__)) or os.curdir - - -def findtests(*, testdir=None, exclude=(), - split_test_dirs=SPLITTESTDIRS, base_mod=""): - """Return a list of all applicable test modules.""" - testdir = findtestdir(testdir) - tests = [] - for name in os.listdir(testdir): - mod, ext = os.path.splitext(name) - if (not mod.startswith("test_")) or (mod in exclude): - continue - if mod in split_test_dirs: - subdir = os.path.join(testdir, mod) - mod = f"{base_mod or 'test'}.{mod}" - tests.extend(findtests(testdir=subdir, exclude=exclude, - split_test_dirs=split_test_dirs, base_mod=mod)) - elif ext in (".py", ""): - tests.append(f"{base_mod}.{mod}" if base_mod else mod) - return sorted(tests) - - -def split_test_packages(tests, *, testdir=None, exclude=(), - split_test_dirs=SPLITTESTDIRS): - testdir = findtestdir(testdir) - splitted = [] - for name in tests: - if name in split_test_dirs: - subdir = os.path.join(testdir, name) - splitted.extend(findtests(testdir=subdir, exclude=exclude, - split_test_dirs=split_test_dirs, - base_mod=name)) - else: - splitted.append(name) - return splitted - - -def get_abs_module(ns: Namespace, test_name: str) -> str: - if test_name.startswith('test.') or ns.testdir: - return test_name - else: - # Import it from the test package - return 'test.' + test_name - - -def _runtest_capture_output_timeout_junit(result: TestResult, ns: Namespace) -> None: - # Capture stdout and stderr, set faulthandler timeout, - # and create JUnit XML report. - - output_on_failure = ns.verbose3 - - use_timeout = ( - ns.timeout is not None and threading_helper.can_start_thread - ) - if use_timeout: - faulthandler.dump_traceback_later(ns.timeout, exit=True) - - try: - support.set_match_tests(ns.match_tests, ns.ignore_tests) - support.junit_xml_list = xml_list = [] if ns.xmlpath else None - if ns.failfast: - support.failfast = True - - if output_on_failure: - support.verbose = True - - stream = io.StringIO() - orig_stdout = sys.stdout - orig_stderr = sys.stderr - print_warning = support.print_warning - orig_print_warnings_stderr = print_warning.orig_stderr - - output = None - try: - sys.stdout = stream - sys.stderr = stream - # print_warning() writes into the temporary stream to preserve - # messages order. 
If support.environment_altered becomes true, - # warnings will be written to sys.stderr below. - print_warning.orig_stderr = stream - - _runtest_env_changed_exc(result, ns, display_failure=False) - # Ignore output if the test passed successfully - if result.state != State.PASSED: - output = stream.getvalue() - finally: - sys.stdout = orig_stdout - sys.stderr = orig_stderr - print_warning.orig_stderr = orig_print_warnings_stderr - - if output is not None: - sys.stderr.write(output) - sys.stderr.flush() - else: - # Tell tests to be moderately quiet - support.verbose = ns.verbose - - _runtest_env_changed_exc(result, ns, - display_failure=not ns.verbose) - - if xml_list: - import xml.etree.ElementTree as ET - result.xml_data = [ET.tostring(x).decode('us-ascii') - for x in xml_list] - finally: - if use_timeout: - faulthandler.cancel_dump_traceback_later() - support.junit_xml_list = None - - -def runtest(ns: Namespace, test_name: str) -> TestResult: - """Run a single test. - - ns -- regrtest namespace of options - test_name -- the name of the test - - Returns a TestResult. - - If ns.xmlpath is not None, xml_data is a list containing each - generated testsuite element. - """ - start_time = time.perf_counter() - result = TestResult(test_name) - try: - _runtest_capture_output_timeout_junit(result, ns) - except: - if not ns.pgo: - msg = traceback.format_exc() - print(f"test {test_name} crashed -- {msg}", - file=sys.stderr, flush=True) - result.state = State.UNCAUGHT_EXC - result.duration = time.perf_counter() - start_time - return result - - -def _test_module(the_module): - loader = unittest.TestLoader() - tests = loader.loadTestsFromModule(the_module) - for error in loader.errors: - print(error, file=sys.stderr) - if loader.errors: - raise Exception("errors while loading tests") - return support.run_unittest(tests) - - -def save_env(ns: Namespace, test_name: str): - return saved_test_environment(test_name, ns.verbose, ns.quiet, pgo=ns.pgo) - - -def regrtest_runner(result, test_func, ns) -> None: - # Run test_func(), collect statistics, and detect reference and memory - # leaks. - - if ns.huntrleaks: - from test.libregrtest.refleak import dash_R - refleak, test_result = dash_R(ns, result.test_name, test_func) - else: - test_result = test_func() - refleak = False - - if refleak: - result.state = State.REFLEAK - - match test_result: - case TestStats(): - stats = test_result - case unittest.TestResult(): - stats = TestStats.from_unittest(test_result) - case doctest.TestResults(): - stats = TestStats.from_doctest(test_result) - case None: - print_warning(f"{result.test_name} test runner returned None: {test_func}") - stats = None - case _: - print_warning(f"Unknown test result type: {type(test_result)}") - stats = None - - result.stats = stats - - -def _load_run_test(result: TestResult, ns: Namespace) -> None: - # Load the test function, run the test function. 
- - abstest = get_abs_module(ns, result.test_name) - - # remove the module from sys.module to reload it if it was already imported - try: - del sys.modules[abstest] - except KeyError: - pass - - the_module = importlib.import_module(abstest) - - if hasattr(the_module, "test_main"): - # https://github.com/python/cpython/issues/89392 - raise Exception(f"Module {result.test_name} defines test_main() which is no longer supported by regrtest") - test_func = functools.partial(_test_module, the_module) - - try: - with save_env(ns, result.test_name): - regrtest_runner(result, test_func, ns) - finally: - # First kill any dangling references to open files etc. - # This can also issue some ResourceWarnings which would otherwise get - # triggered during the following test run, and possibly produce - # failures. - support.gc_collect() - - cleanup_test_droppings(result.test_name, ns.verbose) - - if gc.garbage: - support.environment_altered = True - print_warning(f"{result.test_name} created {len(gc.garbage)} " - f"uncollectable object(s).") - - # move the uncollectable objects somewhere, - # so we don't see them again - FOUND_GARBAGE.extend(gc.garbage) - gc.garbage.clear() - - support.reap_children() - - -def _runtest_env_changed_exc(result: TestResult, ns: Namespace, - display_failure: bool = True) -> None: - # Detect environment changes, handle exceptions. - - # Reset the environment_altered flag to detect if a test altered - # the environment - support.environment_altered = False - - if ns.pgo: - display_failure = False - - test_name = result.test_name - try: - clear_caches() - support.gc_collect() - - with save_env(ns, test_name): - _load_run_test(result, ns) - except support.ResourceDenied as msg: - if not ns.quiet and not ns.pgo: - print(f"{test_name} skipped -- {msg}", flush=True) - result.state = State.RESOURCE_DENIED - return - except unittest.SkipTest as msg: - if not ns.quiet and not ns.pgo: - print(f"{test_name} skipped -- {msg}", flush=True) - result.state = State.SKIPPED - return - except support.TestFailedWithDetails as exc: - msg = f"test {test_name} failed" - if display_failure: - msg = f"{msg} -- {exc}" - print(msg, file=sys.stderr, flush=True) - result.state = State.FAILED - result.errors = exc.errors - result.failures = exc.failures - result.stats = exc.stats - return - except support.TestFailed as exc: - msg = f"test {test_name} failed" - if display_failure: - msg = f"{msg} -- {exc}" - print(msg, file=sys.stderr, flush=True) - result.state = State.FAILED - result.stats = exc.stats - return - except support.TestDidNotRun: - result.state = State.DID_NOT_RUN - return - except KeyboardInterrupt: - print() - result.state = State.INTERRUPTED - return - except: - if not ns.pgo: - msg = traceback.format_exc() - print(f"test {test_name} crashed -- {msg}", - file=sys.stderr, flush=True) - result.state = State.UNCAUGHT_EXC - return - - if support.environment_altered: - result.set_env_changed() - # Don't override the state if it was already set (REFLEAK or ENV_CHANGED) - if result.state is None: - result.state = State.PASSED - - -def cleanup_test_droppings(test_name: str, verbose: int) -> None: - # Try to clean up junk commonly left behind. While tests shouldn't leave - # any files or directories behind, when a test fails that can be tedious - # for it to arrange. 
The consequences can be especially nasty on Windows, - # since if a test leaves a file open, it cannot be deleted by name (while - # there's nothing we can do about that here either, we can display the - # name of the offending test, which is a real help). - for name in (os_helper.TESTFN,): - if not os.path.exists(name): - continue - - if os.path.isdir(name): - import shutil - kind, nuker = "directory", shutil.rmtree - elif os.path.isfile(name): - kind, nuker = "file", os.unlink - else: - raise RuntimeError(f"os.path says {name!r} exists but is neither " - f"directory nor file") - - if verbose: - print_warning(f"{test_name} left behind {kind} {name!r}") - support.environment_altered = True - - try: - import stat - # fix possible permissions problems that might prevent cleanup - os.chmod(name, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO) - nuker(name) - except Exception as exc: - print_warning(f"{test_name} left behind {kind} {name!r} " - f"and it couldn't be removed: {exc}") diff --git a/Lib/test/libregrtest/runtest_mp.py b/Lib/test/libregrtest/runtest_mp.py deleted file mode 100644 index fb1f80b0c054e3..00000000000000 --- a/Lib/test/libregrtest/runtest_mp.py +++ /dev/null @@ -1,564 +0,0 @@ -import dataclasses -import faulthandler -import json -import os.path -import queue -import signal -import subprocess -import sys -import tempfile -import threading -import time -import traceback -from typing import NamedTuple, NoReturn, Literal, Any, TextIO - -from test import support -from test.support import os_helper -from test.support import TestStats - -from test.libregrtest.cmdline import Namespace -from test.libregrtest.main import Regrtest -from test.libregrtest.runtest import ( - runtest, TestResult, State, - PROGRESS_MIN_TIME) -from test.libregrtest.setup import setup_tests -from test.libregrtest.utils import format_duration, print_warning - -if sys.platform == 'win32': - import locale - - -# Display the running tests if nothing happened last N seconds -PROGRESS_UPDATE = 30.0 # seconds -assert PROGRESS_UPDATE >= PROGRESS_MIN_TIME - -# Kill the main process after 5 minutes. It is supposed to write an update -# every PROGRESS_UPDATE seconds. Tolerate 5 minutes for Python slowest -# buildbot workers. 
-MAIN_PROCESS_TIMEOUT = 5 * 60.0 -assert MAIN_PROCESS_TIMEOUT >= PROGRESS_UPDATE - -# Time to wait until a worker completes: should be immediate -JOIN_TIMEOUT = 30.0 # seconds - -USE_PROCESS_GROUP = (hasattr(os, "setsid") and hasattr(os, "killpg")) - - -def must_stop(result: TestResult, ns: Namespace) -> bool: - if result.state == State.INTERRUPTED: - return True - if ns.failfast and result.is_failed(ns.fail_env_changed): - return True - return False - - -def parse_worker_args(worker_args) -> tuple[Namespace, str]: - ns_dict, test_name = json.loads(worker_args) - ns = Namespace(**ns_dict) - return (ns, test_name) - - -def run_test_in_subprocess(testname: str, ns: Namespace, tmp_dir: str, stdout_fh: TextIO) -> subprocess.Popen: - ns_dict = vars(ns) - worker_args = (ns_dict, testname) - worker_args = json.dumps(worker_args) - if ns.python is not None: - executable = ns.python - else: - executable = [sys.executable] - cmd = [*executable, *support.args_from_interpreter_flags(), - '-u', # Unbuffered stdout and stderr - '-m', 'test.regrtest', - '--worker-args', worker_args] - - env = dict(os.environ) - if tmp_dir is not None: - env['TMPDIR'] = tmp_dir - env['TEMP'] = tmp_dir - env['TMP'] = tmp_dir - - # Running the child from the same working directory as regrtest's original - # invocation ensures that TEMPDIR for the child is the same when - # sysconfig.is_python_build() is true. See issue 15300. - kw = dict( - env=env, - stdout=stdout_fh, - # bpo-45410: Write stderr into stdout to keep messages order - stderr=stdout_fh, - text=True, - close_fds=(os.name != 'nt'), - cwd=os_helper.SAVEDCWD, - ) - if USE_PROCESS_GROUP: - kw['start_new_session'] = True - return subprocess.Popen(cmd, **kw) - - -def run_tests_worker(ns: Namespace, test_name: str) -> NoReturn: - setup_tests(ns) - - result = runtest(ns, test_name) - - print() # Force a newline (just in case) - - # Serialize TestResult as dict in JSON - print(json.dumps(result, cls=EncodeTestResult), flush=True) - sys.exit(0) - - -# We do not use a generator so multiple threads can call next(). 
-class MultiprocessIterator: - - """A thread-safe iterator over tests for multiprocess mode.""" - - def __init__(self, tests_iter): - self.lock = threading.Lock() - self.tests_iter = tests_iter - - def __iter__(self): - return self - - def __next__(self): - with self.lock: - if self.tests_iter is None: - raise StopIteration - return next(self.tests_iter) - - def stop(self): - with self.lock: - self.tests_iter = None - - -class MultiprocessResult(NamedTuple): - result: TestResult - # bpo-45410: stderr is written into stdout to keep messages order - worker_stdout: str | None = None - err_msg: str | None = None - - -ExcStr = str -QueueOutput = tuple[Literal[False], MultiprocessResult] | tuple[Literal[True], ExcStr] - - -class ExitThread(Exception): - pass - - -class TestWorkerProcess(threading.Thread): - def __init__(self, worker_id: int, runner: "MultiprocessTestRunner") -> None: - super().__init__() - self.worker_id = worker_id - self.pending = runner.pending - self.output = runner.output - self.ns = runner.ns - self.timeout = runner.worker_timeout - self.regrtest = runner.regrtest - self.current_test_name = None - self.start_time = None - self._popen = None - self._killed = False - self._stopped = False - - def __repr__(self) -> str: - info = [f'TestWorkerProcess #{self.worker_id}'] - if self.is_alive(): - info.append("running") - else: - info.append('stopped') - test = self.current_test_name - if test: - info.append(f'test={test}') - popen = self._popen - if popen is not None: - dt = time.monotonic() - self.start_time - info.extend((f'pid={self._popen.pid}', - f'time={format_duration(dt)}')) - return '<%s>' % ' '.join(info) - - def _kill(self) -> None: - popen = self._popen - if popen is None: - return - - if self._killed: - return - self._killed = True - - if USE_PROCESS_GROUP: - what = f"{self} process group" - else: - what = f"{self}" - - print(f"Kill {what}", file=sys.stderr, flush=True) - try: - if USE_PROCESS_GROUP: - os.killpg(popen.pid, signal.SIGKILL) - else: - popen.kill() - except ProcessLookupError: - # popen.kill(): the process completed, the TestWorkerProcess thread - # read its exit status, but Popen.send_signal() read the returncode - # just before Popen.wait() set returncode. - pass - except OSError as exc: - print_warning(f"Failed to kill {what}: {exc!r}") - - def stop(self) -> None: - # Method called from a different thread to stop this thread - self._stopped = True - self._kill() - - def mp_result_error( - self, - test_result: TestResult, - stdout: str | None = None, - err_msg=None - ) -> MultiprocessResult: - return MultiprocessResult(test_result, stdout, err_msg) - - def _run_process(self, test_name: str, tmp_dir: str, stdout_fh: TextIO) -> int: - self.current_test_name = test_name - try: - popen = run_test_in_subprocess(test_name, self.ns, tmp_dir, stdout_fh) - - self._killed = False - self._popen = popen - except: - self.current_test_name = None - raise - - try: - if self._stopped: - # If kill() has been called before self._popen is set, - # self._popen is still running. Call again kill() - # to ensure that the process is killed. 
- self._kill() - raise ExitThread - - try: - # gh-94026: stdout+stderr are written to tempfile - retcode = popen.wait(timeout=self.timeout) - assert retcode is not None - return retcode - except subprocess.TimeoutExpired: - if self._stopped: - # kill() has been called: communicate() fails on reading - # closed stdout - raise ExitThread - - # On timeout, kill the process - self._kill() - - # None means TIMEOUT for the caller - retcode = None - # bpo-38207: Don't attempt to call communicate() again: on it - # can hang until all child processes using stdout - # pipes completes. - except OSError: - if self._stopped: - # kill() has been called: communicate() fails - # on reading closed stdout - raise ExitThread - raise - except: - self._kill() - raise - finally: - self._wait_completed() - self._popen = None - self.current_test_name = None - - def _runtest(self, test_name: str) -> MultiprocessResult: - if sys.platform == 'win32': - # gh-95027: When stdout is not a TTY, Python uses the ANSI code - # page for the sys.stdout encoding. If the main process runs in a - # terminal, sys.stdout uses WindowsConsoleIO with UTF-8 encoding. - encoding = locale.getencoding() - else: - encoding = sys.stdout.encoding - - # gh-94026: Write stdout+stderr to a tempfile as workaround for - # non-blocking pipes on Emscripten with NodeJS. - with tempfile.TemporaryFile('w+', encoding=encoding) as stdout_fh: - # gh-93353: Check for leaked temporary files in the parent process, - # since the deletion of temporary files can happen late during - # Python finalization: too late for libregrtest. - if not support.is_wasi: - # Don't check for leaked temporary files and directories if Python is - # run on WASI. WASI don't pass environment variables like TMPDIR to - # worker processes. - tmp_dir = tempfile.mkdtemp(prefix="test_python_") - tmp_dir = os.path.abspath(tmp_dir) - try: - retcode = self._run_process(test_name, tmp_dir, stdout_fh) - finally: - tmp_files = os.listdir(tmp_dir) - os_helper.rmtree(tmp_dir) - else: - retcode = self._run_process(test_name, None, stdout_fh) - tmp_files = () - stdout_fh.seek(0) - - try: - stdout = stdout_fh.read().strip() - except Exception as exc: - # gh-101634: Catch UnicodeDecodeError if stdout cannot be - # decoded from encoding - err_msg = f"Cannot read process stdout: {exc}" - result = TestResult(test_name, state=State.MULTIPROCESSING_ERROR) - return self.mp_result_error(result, err_msg=err_msg) - - if retcode is None: - result = TestResult(test_name, state=State.TIMEOUT) - return self.mp_result_error(result, stdout) - - err_msg = None - if retcode != 0: - err_msg = "Exit code %s" % retcode - else: - stdout, _, worker_json = stdout.rpartition("\n") - stdout = stdout.rstrip() - if not worker_json: - err_msg = "Failed to parse worker stdout" - else: - try: - # deserialize run_tests_worker() output - result = json.loads(worker_json, - object_hook=decode_test_result) - except Exception as exc: - err_msg = "Failed to parse worker JSON: %s" % exc - - if err_msg: - result = TestResult(test_name, state=State.MULTIPROCESSING_ERROR) - return self.mp_result_error(result, stdout, err_msg) - - if tmp_files: - msg = (f'\n\n' - f'Warning -- {test_name} leaked temporary files ' - f'({len(tmp_files)}): {", ".join(sorted(tmp_files))}') - stdout += msg - result.set_env_changed() - - return MultiprocessResult(result, stdout) - - def run(self) -> None: - while not self._stopped: - try: - try: - test_name = next(self.pending) - except StopIteration: - break - - self.start_time = time.monotonic() - mp_result = 
self._runtest(test_name) - mp_result.result.duration = time.monotonic() - self.start_time - self.output.put((False, mp_result)) - - if must_stop(mp_result.result, self.ns): - break - except ExitThread: - break - except BaseException: - self.output.put((True, traceback.format_exc())) - break - - def _wait_completed(self) -> None: - popen = self._popen - - try: - popen.wait(JOIN_TIMEOUT) - except (subprocess.TimeoutExpired, OSError) as exc: - print_warning(f"Failed to wait for {self} completion " - f"(timeout={format_duration(JOIN_TIMEOUT)}): " - f"{exc!r}") - - def wait_stopped(self, start_time: float) -> None: - # bpo-38207: MultiprocessTestRunner.stop_workers() called self.stop() - # which killed the process. Sometimes, killing the process from the - # main thread does not interrupt popen.communicate() in - # TestWorkerProcess thread. This loop with a timeout is a workaround - # for that. - # - # Moreover, if this method fails to join the thread, it is likely - # that Python will hang at exit while calling threading._shutdown() - # which tries again to join the blocked thread. Regrtest.main() - # uses EXIT_TIMEOUT to workaround this second bug. - while True: - # Write a message every second - self.join(1.0) - if not self.is_alive(): - break - dt = time.monotonic() - start_time - self.regrtest.log(f"Waiting for {self} thread " - f"for {format_duration(dt)}") - if dt > JOIN_TIMEOUT: - print_warning(f"Failed to join {self} in {format_duration(dt)}") - break - - -def get_running(workers: list[TestWorkerProcess]) -> list[TestWorkerProcess]: - running = [] - for worker in workers: - current_test_name = worker.current_test_name - if not current_test_name: - continue - dt = time.monotonic() - worker.start_time - if dt >= PROGRESS_MIN_TIME: - text = '%s (%s)' % (current_test_name, format_duration(dt)) - running.append(text) - return running - - -class MultiprocessTestRunner: - def __init__(self, regrtest: Regrtest) -> None: - self.regrtest = regrtest - self.log = self.regrtest.log - self.ns = regrtest.ns - self.output: queue.Queue[QueueOutput] = queue.Queue() - self.pending = MultiprocessIterator(self.regrtest.tests) - if self.ns.timeout is not None: - # Rely on faulthandler to kill a worker process. This timouet is - # when faulthandler fails to kill a worker process. Give a maximum - # of 5 minutes to faulthandler to kill the worker. - self.worker_timeout = min(self.ns.timeout * 1.5, - self.ns.timeout + 5 * 60) - else: - self.worker_timeout = None - self.workers = None - - def start_workers(self) -> None: - self.workers = [TestWorkerProcess(index, self) - for index in range(1, self.ns.use_mp + 1)] - msg = f"Run tests in parallel using {len(self.workers)} child processes" - if self.ns.timeout: - msg += (" (timeout: %s, worker timeout: %s)" - % (format_duration(self.ns.timeout), - format_duration(self.worker_timeout))) - self.log(msg) - for worker in self.workers: - worker.start() - - def stop_workers(self) -> None: - start_time = time.monotonic() - for worker in self.workers: - worker.stop() - for worker in self.workers: - worker.wait_stopped(start_time) - - def _get_result(self) -> QueueOutput | None: - use_faulthandler = (self.ns.timeout is not None) - timeout = PROGRESS_UPDATE - - # bpo-46205: check the status of workers every iteration to avoid - # waiting forever on an empty queue. 
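The _runtest() code above relies on a simple convention for worker output: the worker prints its normal output first and appends a single JSON line at the very end, and the parent splits the two with rpartition("\n"). A small standalone sketch of that convention follows; the payload and log text are illustrative, not taken from the patch.

    import json

    # Worker side: regular output first, one JSON line appended at the end.
    payload = {"test": "test_os", "state": "PASSED"}
    worker_stdout = "0:00:01 [1/1] test_os\n" + json.dumps(payload)

    # Parent side: everything before the last newline is plain output,
    # the rest is the serialized result.
    text, _, last_line = worker_stdout.rpartition("\n")
    assert json.loads(last_line) == payload
    print(text)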
- while any(worker.is_alive() for worker in self.workers): - if use_faulthandler: - faulthandler.dump_traceback_later(MAIN_PROCESS_TIMEOUT, - exit=True) - - # wait for a thread - try: - return self.output.get(timeout=timeout) - except queue.Empty: - pass - - # display progress - running = get_running(self.workers) - if running and not self.ns.pgo: - self.log('running: %s' % ', '.join(running)) - - # all worker threads are done: consume pending results - try: - return self.output.get(timeout=0) - except queue.Empty: - return None - - def display_result(self, mp_result: MultiprocessResult) -> None: - result = mp_result.result - - text = str(result) - if mp_result.err_msg: - # MULTIPROCESSING_ERROR - text += ' (%s)' % mp_result.err_msg - elif (result.duration >= PROGRESS_MIN_TIME and not self.ns.pgo): - text += ' (%s)' % format_duration(result.duration) - running = get_running(self.workers) - if running and not self.ns.pgo: - text += ' -- running: %s' % ', '.join(running) - self.regrtest.display_progress(self.test_index, text) - - def _process_result(self, item: QueueOutput) -> bool: - """Returns True if test runner must stop.""" - if item[0]: - # Thread got an exception - format_exc = item[1] - print_warning(f"regrtest worker thread failed: {format_exc}") - result = TestResult("", state=State.MULTIPROCESSING_ERROR) - self.regrtest.accumulate_result(result) - return True - - self.test_index += 1 - mp_result = item[1] - self.regrtest.accumulate_result(mp_result.result) - self.display_result(mp_result) - - if mp_result.worker_stdout: - print(mp_result.worker_stdout, flush=True) - - if must_stop(mp_result.result, self.ns): - return True - - return False - - def run_tests(self) -> None: - self.start_workers() - - self.test_index = 0 - try: - while True: - item = self._get_result() - if item is None: - break - - stop = self._process_result(item) - if stop: - break - except KeyboardInterrupt: - print() - self.regrtest.interrupted = True - finally: - if self.ns.timeout is not None: - faulthandler.cancel_dump_traceback_later() - - # Always ensure that all worker processes are no longer - # worker when we exit this function - self.pending.stop() - self.stop_workers() - - -def run_tests_multiprocess(regrtest: Regrtest) -> None: - MultiprocessTestRunner(regrtest).run_tests() - - -class EncodeTestResult(json.JSONEncoder): - """Encode a TestResult (sub)class object into a JSON dict.""" - - def default(self, o: Any) -> dict[str, Any]: - if isinstance(o, TestResult): - result = dataclasses.asdict(o) - result["__test_result__"] = o.__class__.__name__ - return result - - return super().default(o) - - -def decode_test_result(d: dict[str, Any]) -> TestResult | TestStats | dict[str, Any]: - """Decode a TestResult (sub)class object from a JSON dict.""" - - if "__test_result__" not in d: - return d - - d.pop('__test_result__') - if d['stats'] is not None: - d['stats'] = TestStats(**d['stats']) - return TestResult(**d) diff --git a/Lib/test/libregrtest/runtests.py b/Lib/test/libregrtest/runtests.py new file mode 100644 index 00000000000000..4da312db4cb02e --- /dev/null +++ b/Lib/test/libregrtest/runtests.py @@ -0,0 +1,162 @@ +import contextlib +import dataclasses +import json +import os +import subprocess +from typing import Any + +from test import support + +from .utils import ( + StrPath, StrJSON, TestTuple, FilterTuple, FilterDict) + + +class JsonFileType: + UNIX_FD = "UNIX_FD" + WINDOWS_HANDLE = "WINDOWS_HANDLE" + STDOUT = "STDOUT" + + +@dataclasses.dataclass(slots=True, frozen=True) +class JsonFile: + # file 
type depends on file_type: + # - UNIX_FD: file descriptor (int) + # - WINDOWS_HANDLE: handle (int) + # - STDOUT: use process stdout (None) + file: int | None + file_type: str + + def configure_subprocess(self, popen_kwargs: dict) -> None: + match self.file_type: + case JsonFileType.UNIX_FD: + # Unix file descriptor + popen_kwargs['pass_fds'] = [self.file] + case JsonFileType.WINDOWS_HANDLE: + # Windows handle + startupinfo = subprocess.STARTUPINFO() + startupinfo.lpAttributeList = {"handle_list": [self.file]} + popen_kwargs['startupinfo'] = startupinfo + + @contextlib.contextmanager + def inherit_subprocess(self): + if self.file_type == JsonFileType.WINDOWS_HANDLE: + os.set_handle_inheritable(self.file, True) + try: + yield + finally: + os.set_handle_inheritable(self.file, False) + else: + yield + + def open(self, mode='r', *, encoding): + if self.file_type == JsonFileType.STDOUT: + raise ValueError("for STDOUT file type, just use sys.stdout") + + file = self.file + if self.file_type == JsonFileType.WINDOWS_HANDLE: + import msvcrt + # Create a file descriptor from the handle + file = msvcrt.open_osfhandle(file, os.O_WRONLY) + return open(file, mode, encoding=encoding) + + +@dataclasses.dataclass(slots=True, frozen=True) +class HuntRefleak: + warmups: int + runs: int + filename: StrPath + + +@dataclasses.dataclass(slots=True, frozen=True) +class RunTests: + tests: TestTuple + fail_fast: bool + fail_env_changed: bool + match_tests: FilterTuple | None + ignore_tests: FilterTuple | None + match_tests_dict: FilterDict | None + rerun: bool + forever: bool + pgo: bool + pgo_extended: bool + output_on_failure: bool + timeout: float | None + verbose: int + quiet: bool + hunt_refleak: HuntRefleak | None + test_dir: StrPath | None + use_junit: bool + memory_limit: str | None + gc_threshold: int | None + use_resources: tuple[str, ...] + python_cmd: tuple[str, ...] | None + randomize: bool + random_seed: int | None + json_file: JsonFile | None + + def copy(self, **override): + state = dataclasses.asdict(self) + state.update(override) + return RunTests(**state) + + def get_match_tests(self, test_name) -> FilterTuple | None: + if self.match_tests_dict is not None: + return self.match_tests_dict.get(test_name, None) + else: + return None + + def get_jobs(self): + # Number of run_single_test() calls needed to run all tests. + # None means that there is not bound limit (--forever option). + if self.forever: + return None + return len(self.tests) + + def iter_tests(self): + if self.forever: + while True: + yield from self.tests + else: + yield from self.tests + + def as_json(self) -> StrJSON: + return json.dumps(self, cls=_EncodeRunTests) + + @staticmethod + def from_json(worker_json: StrJSON) -> 'RunTests': + return json.loads(worker_json, object_hook=_decode_runtests) + + def json_file_use_stdout(self) -> bool: + # Use STDOUT in two cases: + # + # - If --python command line option is used; + # - On Emscripten and WASI. + # + # On other platforms, UNIX_FD or WINDOWS_HANDLE can be used. 
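The as_json()/from_json() helpers above round-trip the frozen RunTests dataclass through JSON, using the _EncodeRunTests encoder and _decode_runtests object hook defined just below. A simplified, self-contained sketch of the same pattern, with an illustrative Config dataclass standing in for RunTests (the marker key tells the object hook which dicts to rebuild, since the hook is called for every JSON object):

    import dataclasses
    import json

    @dataclasses.dataclass(slots=True, frozen=True)
    class Config:
        tests: tuple[str, ...]
        timeout: float | None

    class _EncodeConfig(json.JSONEncoder):
        def default(self, o):
            if isinstance(o, Config):
                data = dataclasses.asdict(o)
                data["__config__"] = True      # marker consumed by the decoder
                return data
            return super().default(o)

    def _decode_config(data):
        if data.pop("__config__", False):
            data["tests"] = tuple(data["tests"])   # JSON has no tuples, only lists
            return Config(**data)
        return data

    cfg = Config(tests=("test_os", "test_sys"), timeout=60.0)
    text = json.dumps(cfg, cls=_EncodeConfig)
    assert json.loads(text, object_hook=_decode_config) == cfg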
+ return ( + bool(self.python_cmd) + or support.is_emscripten + or support.is_wasi + ) + + +class _EncodeRunTests(json.JSONEncoder): + def default(self, o: Any) -> dict[str, Any]: + if isinstance(o, RunTests): + result = dataclasses.asdict(o) + result["__runtests__"] = True + return result + else: + return super().default(o) + + +def _decode_runtests(data: dict[str, Any]) -> RunTests | dict[str, Any]: + if "__runtests__" in data: + data.pop('__runtests__') + if data['hunt_refleak']: + data['hunt_refleak'] = HuntRefleak(**data['hunt_refleak']) + if data['json_file']: + data['json_file'] = JsonFile(**data['json_file']) + return RunTests(**data) + else: + return data diff --git a/Lib/test/libregrtest/save_env.py b/Lib/test/libregrtest/save_env.py index 7e801a591325e1..b2cc381344b2ef 100644 --- a/Lib/test/libregrtest/save_env.py +++ b/Lib/test/libregrtest/save_env.py @@ -3,9 +3,11 @@ import os import sys import threading + from test import support from test.support import os_helper -from test.libregrtest.utils import print_warning + +from .utils import print_warning class SkipTestEnvironment(Exception): @@ -34,7 +36,7 @@ class saved_test_environment: items is also printed. """ - def __init__(self, test_name, verbose=0, quiet=False, *, pgo=False): + def __init__(self, test_name, verbose, quiet, *, pgo): self.test_name = test_name self.verbose = verbose self.quiet = quiet @@ -161,11 +163,11 @@ def restore_warnings_filters(self, saved_filters): warnings.filters[:] = saved_filters[2] def get_asyncore_socket_map(self): - asyncore = sys.modules.get('asyncore') + asyncore = sys.modules.get('test.support.asyncore') # XXX Making a copy keeps objects alive until __exit__ gets called. return asyncore and asyncore.socket_map.copy() or {} def restore_asyncore_socket_map(self, saved_map): - asyncore = sys.modules.get('asyncore') + asyncore = sys.modules.get('test.support.asyncore') if asyncore is not None: asyncore.close_all(ignore_all=True) asyncore.socket_map.update(saved_map) @@ -257,8 +259,10 @@ def restore_sysconfig__INSTALL_SCHEMES(self, saved): sysconfig._INSTALL_SCHEMES.update(saved[2]) def get_files(self): + # XXX: Maybe add an allow-list here? 
return sorted(fn + ('/' if os.path.isdir(fn) else '') - for fn in os.listdir()) + for fn in os.listdir() + if not fn.startswith(".hypothesis")) def restore_files(self, saved_value): fn = os_helper.TESTFN if fn not in saved_value and (fn + '/') not in saved_value: diff --git a/Lib/test/libregrtest/setup.py b/Lib/test/libregrtest/setup.py index ecd7fa33107311..793347f60ad93c 100644 --- a/Lib/test/libregrtest/setup.py +++ b/Lib/test/libregrtest/setup.py @@ -1,24 +1,32 @@ -import atexit import faulthandler +import gc import os +import random import signal import sys import unittest from test import support from test.support.os_helper import TESTFN_UNDECODABLE, FS_NONASCII -try: - import gc -except ImportError: - gc = None -from test.libregrtest.utils import (setup_unraisable_hook, - setup_threading_excepthook) +from .runtests import RunTests +from .utils import ( + setup_unraisable_hook, setup_threading_excepthook, fix_umask, + adjust_rlimit_nofile) UNICODE_GUARD_ENV = "PYTHONREGRTEST_UNICODE_GUARD" -def setup_tests(ns): +def setup_test_dir(testdir: str | None) -> None: + if testdir: + # Prepend test directory to sys.path, so runtest() will be able + # to locate tests + sys.path.insert(0, os.path.abspath(testdir)) + + +def setup_process(): + fix_umask() + try: stderr_fd = sys.__stderr__.fileno() except (ValueError, AttributeError): @@ -40,14 +48,9 @@ def setup_tests(ns): for signum in signals: faulthandler.register(signum, chain=True, file=stderr_fd) - _adjust_resource_limits() - replace_stdout() - support.record_original_stdout(sys.stdout) + adjust_rlimit_nofile() - if ns.testdir: - # Prepend test directory to sys.path, so runtest() will be able - # to locate tests - sys.path.insert(0, os.path.abspath(ns.testdir)) + support.record_original_stdout(sys.stdout) # Some times __path__ and __file__ are not absolute (e.g. while running from # Lib/) and, if we change the CWD to run the tests in a temporary dir, some @@ -66,19 +69,6 @@ def setup_tests(ns): if getattr(module, '__file__', None): module.__file__ = os.path.abspath(module.__file__) - if ns.huntrleaks: - unittest.BaseTestSuite._cleanup = False - - if ns.memlimit is not None: - support.set_memlimit(ns.memlimit) - - if ns.threshold is not None: - gc.set_threshold(ns.threshold) - - support.suppress_msvcrt_asserts(ns.verbose and ns.verbose >= 2) - - support.use_resources = ns.use_resources - if hasattr(sys, 'addaudithook'): # Add an auditing hook for all tests to ensure PySys_Audit is tested def _test_audit_hook(name, args): @@ -88,7 +78,37 @@ def _test_audit_hook(name, args): setup_unraisable_hook() setup_threading_excepthook() - timeout = ns.timeout + # Ensure there's a non-ASCII character in env vars at all times to force + # tests consider this case. See BPO-44647 for details. 
+ if TESTFN_UNDECODABLE and os.supports_bytes_environ: + os.environb.setdefault(UNICODE_GUARD_ENV.encode(), TESTFN_UNDECODABLE) + elif FS_NONASCII: + os.environ.setdefault(UNICODE_GUARD_ENV, FS_NONASCII) + + +def setup_tests(runtests: RunTests): + support.verbose = runtests.verbose + support.failfast = runtests.fail_fast + support.PGO = runtests.pgo + support.PGO_EXTENDED = runtests.pgo_extended + + support.set_match_tests(runtests.match_tests, runtests.ignore_tests) + + if runtests.use_junit: + support.junit_xml_list = [] + from test.support.testresult import RegressionTestResult + RegressionTestResult.USE_XML = True + else: + support.junit_xml_list = None + + if runtests.memory_limit is not None: + support.set_memlimit(runtests.memory_limit) + + support.suppress_msvcrt_asserts(runtests.verbose >= 2) + + support.use_resources = runtests.use_resources + + timeout = runtests.timeout if timeout is not None: # For a slow buildbot worker, increase SHORT_TIMEOUT and LONG_TIMEOUT support.LOOPBACK_TIMEOUT = max(support.LOOPBACK_TIMEOUT, timeout / 120) @@ -102,61 +122,10 @@ def _test_audit_hook(name, args): support.SHORT_TIMEOUT = min(support.SHORT_TIMEOUT, timeout) support.LONG_TIMEOUT = min(support.LONG_TIMEOUT, timeout) - if ns.xmlpath: - from test.support.testresult import RegressionTestResult - RegressionTestResult.USE_XML = True - - # Ensure there's a non-ASCII character in env vars at all times to force - # tests consider this case. See BPO-44647 for details. - if TESTFN_UNDECODABLE and os.supports_bytes_environ: - os.environb.setdefault(UNICODE_GUARD_ENV.encode(), TESTFN_UNDECODABLE) - elif FS_NONASCII: - os.environ.setdefault(UNICODE_GUARD_ENV, FS_NONASCII) - - -def replace_stdout(): - """Set stdout encoder error handler to backslashreplace (as stderr error - handler) to avoid UnicodeEncodeError when printing a traceback""" - stdout = sys.stdout - try: - fd = stdout.fileno() - except ValueError: - # On IDLE, sys.stdout has no file descriptor and is not a TextIOWrapper - # object. Leaving sys.stdout unchanged. - # - # Catch ValueError to catch io.UnsupportedOperation on TextIOBase - # and ValueError on a closed stream. - return - - sys.stdout = open(fd, 'w', - encoding=stdout.encoding, - errors="backslashreplace", - closefd=False, - newline='\n') - - def restore_stdout(): - sys.stdout.close() - sys.stdout = stdout - atexit.register(restore_stdout) + if runtests.hunt_refleak: + unittest.BaseTestSuite._cleanup = False + if runtests.gc_threshold is not None: + gc.set_threshold(runtests.gc_threshold) -def _adjust_resource_limits(): - """Adjust the system resource limits (ulimit) if needed.""" - try: - import resource - from resource import RLIMIT_NOFILE, RLIM_INFINITY - except ImportError: - return - fd_limit, max_fds = resource.getrlimit(RLIMIT_NOFILE) - # On macOS the default fd limit is sometimes too low (256) for our - # test suite to succeed. Raise it to something more reasonable. - # 1024 is a common Linux default. 
- desired_fds = 1024 - if fd_limit < desired_fds and fd_limit < max_fds: - new_fd_limit = min(desired_fds, max_fds) - try: - resource.setrlimit(RLIMIT_NOFILE, (new_fd_limit, max_fds)) - print(f"Raised RLIMIT_NOFILE: {fd_limit} -> {new_fd_limit}") - except (ValueError, OSError) as err: - print(f"Unable to raise RLIMIT_NOFILE from {fd_limit} to " - f"{new_fd_limit}: {err}.") + random.seed(runtests.random_seed) diff --git a/Lib/test/libregrtest/single.py b/Lib/test/libregrtest/single.py new file mode 100644 index 00000000000000..0304f858edf42c --- /dev/null +++ b/Lib/test/libregrtest/single.py @@ -0,0 +1,278 @@ +import doctest +import faulthandler +import gc +import importlib +import io +import sys +import time +import traceback +import unittest + +from test import support +from test.support import TestStats +from test.support import threading_helper + +from .result import State, TestResult +from .runtests import RunTests +from .save_env import saved_test_environment +from .setup import setup_tests +from .utils import ( + TestName, + clear_caches, remove_testfn, abs_module_name, print_warning) + + +# Minimum duration of a test to display its duration or to mention that +# the test is running in background +PROGRESS_MIN_TIME = 30.0 # seconds + + +def run_unittest(test_mod): + loader = unittest.TestLoader() + tests = loader.loadTestsFromModule(test_mod) + for error in loader.errors: + print(error, file=sys.stderr) + if loader.errors: + raise Exception("errors while loading tests") + return support.run_unittest(tests) + + +def regrtest_runner(result: TestResult, test_func, runtests: RunTests) -> None: + # Run test_func(), collect statistics, and detect reference and memory + # leaks. + if runtests.hunt_refleak: + from .refleak import runtest_refleak + refleak, test_result = runtest_refleak(result.test_name, test_func, + runtests.hunt_refleak, + runtests.quiet) + else: + test_result = test_func() + refleak = False + + if refleak: + result.state = State.REFLEAK + + stats: TestStats | None + + match test_result: + case TestStats(): + stats = test_result + case unittest.TestResult(): + stats = TestStats.from_unittest(test_result) + case doctest.TestResults(): + stats = TestStats.from_doctest(test_result) + case None: + print_warning(f"{result.test_name} test runner returned None: {test_func}") + stats = None + case _: + print_warning(f"Unknown test result type: {type(test_result)}") + stats = None + + result.stats = stats + + +# Storage of uncollectable GC objects (gc.garbage) +GC_GARBAGE = [] + + +def _load_run_test(result: TestResult, runtests: RunTests) -> None: + # Load the test module and run the tests. + test_name = result.test_name + module_name = abs_module_name(test_name, runtests.test_dir) + + # Remove the module from sys.module to reload it if it was already imported + sys.modules.pop(module_name, None) + + test_mod = importlib.import_module(module_name) + + if hasattr(test_mod, "test_main"): + # https://github.com/python/cpython/issues/89392 + raise Exception(f"Module {test_name} defines test_main() which " + f"is no longer supported by regrtest") + def test_func(): + return run_unittest(test_mod) + + try: + regrtest_runner(result, test_func, runtests) + finally: + # First kill any dangling references to open files etc. + # This can also issue some ResourceWarnings which would otherwise get + # triggered during the following test run, and possibly produce + # failures. 
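regrtest_runner() above normalizes whatever the test callable returned by matching on its type (TestStats, unittest.TestResult, doctest.TestResults, None, or anything else). A minimal standalone illustration of that class-pattern matching, using an invented summarize() helper rather than the regrtest types:

    import unittest

    def summarize(outcome):
        match outcome:
            case unittest.TestResult():
                return f"{outcome.testsRun} tests run"
            case None:
                return "runner returned no result"
            case _:
                return f"unsupported result type: {type(outcome)!r}"

    res = unittest.TestResult()
    res.testsRun = 3
    print(summarize(res))    # -> '3 tests run'
    print(summarize(None))   # -> 'runner returned no result'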
+ support.gc_collect() + + remove_testfn(test_name, runtests.verbose) + + if gc.garbage: + support.environment_altered = True + print_warning(f"{test_name} created {len(gc.garbage)} " + f"uncollectable object(s)") + + # move the uncollectable objects somewhere, + # so we don't see them again + GC_GARBAGE.extend(gc.garbage) + gc.garbage.clear() + + support.reap_children() + + +def _runtest_env_changed_exc(result: TestResult, runtests: RunTests, + display_failure: bool = True) -> None: + # Handle exceptions, detect environment changes. + + # Reset the environment_altered flag to detect if a test altered + # the environment + support.environment_altered = False + + pgo = runtests.pgo + if pgo: + display_failure = False + quiet = runtests.quiet + + test_name = result.test_name + try: + clear_caches() + support.gc_collect() + + with saved_test_environment(test_name, + runtests.verbose, quiet, pgo=pgo): + _load_run_test(result, runtests) + except support.ResourceDenied as exc: + if not quiet and not pgo: + print(f"{test_name} skipped -- {exc}", flush=True) + result.state = State.RESOURCE_DENIED + return + except unittest.SkipTest as exc: + if not quiet and not pgo: + print(f"{test_name} skipped -- {exc}", flush=True) + result.state = State.SKIPPED + return + except support.TestFailedWithDetails as exc: + msg = f"test {test_name} failed" + if display_failure: + msg = f"{msg} -- {exc}" + print(msg, file=sys.stderr, flush=True) + result.state = State.FAILED + result.errors = exc.errors + result.failures = exc.failures + result.stats = exc.stats + return + except support.TestFailed as exc: + msg = f"test {test_name} failed" + if display_failure: + msg = f"{msg} -- {exc}" + print(msg, file=sys.stderr, flush=True) + result.state = State.FAILED + result.stats = exc.stats + return + except support.TestDidNotRun: + result.state = State.DID_NOT_RUN + return + except KeyboardInterrupt: + print() + result.state = State.INTERRUPTED + return + except: + if not pgo: + msg = traceback.format_exc() + print(f"test {test_name} crashed -- {msg}", + file=sys.stderr, flush=True) + result.state = State.UNCAUGHT_EXC + return + + if support.environment_altered: + result.set_env_changed() + # Don't override the state if it was already set (REFLEAK or ENV_CHANGED) + if result.state is None: + result.state = State.PASSED + + +def _runtest(result: TestResult, runtests: RunTests) -> None: + # Capture stdout and stderr, set faulthandler timeout, + # and create JUnit XML report. + verbose = runtests.verbose + output_on_failure = runtests.output_on_failure + timeout = runtests.timeout + + use_timeout = ( + timeout is not None and threading_helper.can_start_thread + ) + if use_timeout: + faulthandler.dump_traceback_later(timeout, exit=True) + + try: + setup_tests(runtests) + + if output_on_failure: + support.verbose = True + + stream = io.StringIO() + orig_stdout = sys.stdout + orig_stderr = sys.stderr + print_warning = support.print_warning + orig_print_warnings_stderr = print_warning.orig_stderr + + output = None + try: + sys.stdout = stream + sys.stderr = stream + # print_warning() writes into the temporary stream to preserve + # messages order. If support.environment_altered becomes true, + # warnings will be written to sys.stderr below. 
+ print_warning.orig_stderr = stream + + _runtest_env_changed_exc(result, runtests, display_failure=False) + # Ignore output if the test passed successfully + if result.state != State.PASSED: + output = stream.getvalue() + finally: + sys.stdout = orig_stdout + sys.stderr = orig_stderr + print_warning.orig_stderr = orig_print_warnings_stderr + + if output is not None: + sys.stderr.write(output) + sys.stderr.flush() + else: + # Tell tests to be moderately quiet + support.verbose = verbose + _runtest_env_changed_exc(result, runtests, + display_failure=not verbose) + + xml_list = support.junit_xml_list + if xml_list: + import xml.etree.ElementTree as ET + result.xml_data = [ET.tostring(x).decode('us-ascii') + for x in xml_list] + finally: + if use_timeout: + faulthandler.cancel_dump_traceback_later() + support.junit_xml_list = None + + +def run_single_test(test_name: TestName, runtests: RunTests) -> TestResult: + """Run a single test. + + test_name -- the name of the test + + Returns a TestResult. + + If runtests.use_junit, xml_data is a list containing each generated + testsuite element. + """ + start_time = time.perf_counter() + result = TestResult(test_name) + pgo = runtests.pgo + try: + _runtest(result, runtests) + except: + if not pgo: + msg = traceback.format_exc() + print(f"test {test_name} crashed -- {msg}", + file=sys.stderr, flush=True) + result.state = State.UNCAUGHT_EXC + + sys.stdout.flush() + sys.stderr.flush() + + result.duration = time.perf_counter() - start_time + return result diff --git a/Lib/test/libregrtest/utils.py b/Lib/test/libregrtest/utils.py index 49f3a236f3d14b..b5bbe0ef6596a7 100644 --- a/Lib/test/libregrtest/utils.py +++ b/Lib/test/libregrtest/utils.py @@ -1,9 +1,59 @@ +import contextlib +import faulthandler +import locale import math import os.path +import platform +import random +import shlex +import signal +import subprocess import sys import sysconfig +import tempfile import textwrap +from collections.abc import Callable + from test import support +from test.support import os_helper +from test.support import threading_helper + + +# All temporary files and temporary directories created by libregrtest should +# use TMP_PREFIX so cleanup_temp_dir() can remove them all. +TMP_PREFIX = 'test_python_' +WORK_DIR_PREFIX = TMP_PREFIX +WORKER_WORK_DIR_PREFIX = WORK_DIR_PREFIX + 'worker_' + +# bpo-38203: Maximum delay in seconds to exit Python (call Py_Finalize()). +# Used to protect against threading._shutdown() hang. +# Must be smaller than buildbot "1200 seconds without output" limit. +EXIT_TIMEOUT = 120.0 + + +ALL_RESOURCES = ('audio', 'curses', 'largefile', 'network', + 'decimal', 'cpu', 'subprocess', 'urlfetch', 'gui', 'walltime') + +# Other resources excluded from --use=all: +# +# - extralagefile (ex: test_zipfile64): really too slow to be enabled +# "by default" +# - tzdata: while needed to validate fully test_datetime, it makes +# test_datetime too slow (15-20 min on some buildbots) and so is disabled by +# default (see bpo-30822). +RESOURCE_NAMES = ALL_RESOURCES + ('extralargefile', 'tzdata') + + +# Types for types hints +StrPath = str +TestName = str +StrJSON = str +TestTuple = tuple[TestName, ...] +TestList = list[TestName] +# --match and --ignore options: list of patterns +# ('*' joker character can be used) +FilterTuple = tuple[TestName, ...] 
+FilterDict = dict[TestName, FilterTuple] def format_duration(seconds): @@ -31,7 +81,7 @@ def format_duration(seconds): return ' '.join(parts) -def removepy(names): +def strip_py_suffix(names: list[str] | None) -> None: if not names: return for idx, name in enumerate(names): @@ -40,11 +90,20 @@ def removepy(names): names[idx] = basename +def plural(n, singular, plural=None): + if n == 1: + return singular + elif plural is not None: + return plural + else: + return singular + 's' + + def count(n, word): if n == 1: - return "%d %s" % (n, word) + return f"{n} {word}" else: - return "%d %ss" % (n, word) + return f"{n} {word}s" def printlist(x, width=70, indent=4, file=None): @@ -125,15 +184,6 @@ def clear_caches(): if stream is not None: stream.flush() - # Clear assorted module caches. - # Don't worry about resetting the cache if the module is not loaded - try: - distutils_dir_util = sys.modules['distutils.dir_util'] - except KeyError: - pass - else: - distutils_dir_util._path_created.clear() - try: re = sys.modules['re'] except KeyError: @@ -212,6 +262,13 @@ def clear_caches(): for f in typing._cleanups: f() + try: + fractions = sys.modules['fractions'] + except KeyError: + pass + else: + fractions._hash_algorithm.cache_clear() + def get_build_info(): # Get most important configure and build options as a list of strings. @@ -292,3 +349,331 @@ def get_build_info(): build.append("dtrace") return build + + +def get_temp_dir(tmp_dir: StrPath | None = None) -> StrPath: + if tmp_dir: + tmp_dir = os.path.expanduser(tmp_dir) + else: + # When tests are run from the Python build directory, it is best practice + # to keep the test files in a subfolder. This eases the cleanup of leftover + # files using the "make distclean" command. + if sysconfig.is_python_build(): + if not support.is_wasi: + tmp_dir = sysconfig.get_config_var('abs_builddir') + if tmp_dir is None: + tmp_dir = sysconfig.get_config_var('abs_srcdir') + if not tmp_dir: + # gh-74470: On Windows, only srcdir is available. Using + # abs_builddir mostly matters on UNIX when building + # Python out of the source tree, especially when the + # source tree is read only. + tmp_dir = sysconfig.get_config_var('srcdir') + tmp_dir = os.path.join(tmp_dir, 'build') + else: + # WASI platform + tmp_dir = sysconfig.get_config_var('projectbase') + tmp_dir = os.path.join(tmp_dir, 'build') + + # When get_temp_dir() is called in a worker process, + # get_temp_dir() path is different than in the parent process + # which is not a WASI process. So the parent does not create + # the same "tmp_dir" than the test worker process. + os.makedirs(tmp_dir, exist_ok=True) + else: + tmp_dir = tempfile.gettempdir() + + return os.path.abspath(tmp_dir) + + +def fix_umask(): + if support.is_emscripten: + # Emscripten has default umask 0o777, which breaks some tests. + # see https://github.com/emscripten-core/emscripten/issues/17269 + old_mask = os.umask(0) + if old_mask == 0o777: + os.umask(0o027) + else: + os.umask(old_mask) + + +def get_work_dir(parent_dir: StrPath, worker: bool = False) -> StrPath: + # Define a writable temp dir that will be used as cwd while running + # the tests. The name of the dir includes the pid to allow parallel + # testing (see the -j option). + # Emscripten and WASI have stubbed getpid(), Emscripten has only + # milisecond clock resolution. Use randint() instead. 
+ if support.is_emscripten or support.is_wasi: + nounce = random.randint(0, 1_000_000) + else: + nounce = os.getpid() + + if worker: + work_dir = WORK_DIR_PREFIX + str(nounce) + else: + work_dir = WORKER_WORK_DIR_PREFIX + str(nounce) + work_dir += os_helper.FS_NONASCII + work_dir = os.path.join(parent_dir, work_dir) + return work_dir + + +@contextlib.contextmanager +def exit_timeout(): + try: + yield + except SystemExit as exc: + # bpo-38203: Python can hang at exit in Py_Finalize(), especially + # on threading._shutdown() call: put a timeout + if threading_helper.can_start_thread: + faulthandler.dump_traceback_later(EXIT_TIMEOUT, exit=True) + sys.exit(exc.code) + + +def remove_testfn(test_name: TestName, verbose: int) -> None: + # Try to clean up os_helper.TESTFN if left behind. + # + # While tests shouldn't leave any files or directories behind, when a test + # fails that can be tedious for it to arrange. The consequences can be + # especially nasty on Windows, since if a test leaves a file open, it + # cannot be deleted by name (while there's nothing we can do about that + # here either, we can display the name of the offending test, which is a + # real help). + name = os_helper.TESTFN + if not os.path.exists(name): + return + + nuker: Callable[[str], None] + if os.path.isdir(name): + import shutil + kind, nuker = "directory", shutil.rmtree + elif os.path.isfile(name): + kind, nuker = "file", os.unlink + else: + raise RuntimeError(f"os.path says {name!r} exists but is neither " + f"directory nor file") + + if verbose: + print_warning(f"{test_name} left behind {kind} {name!r}") + support.environment_altered = True + + try: + import stat + # fix possible permissions problems that might prevent cleanup + os.chmod(name, stat.S_IRWXU | stat.S_IRWXG | stat.S_IRWXO) + nuker(name) + except Exception as exc: + print_warning(f"{test_name} left behind {kind} {name!r} " + f"and it couldn't be removed: {exc}") + + +def abs_module_name(test_name: TestName, test_dir: StrPath | None) -> TestName: + if test_name.startswith('test.') or test_dir: + return test_name + else: + # Import it from the test package + return 'test.' + test_name + + +# gh-90681: When rerunning tests, we might need to rerun the whole +# class or module suite if some its life-cycle hooks fail. +# Test level hooks are not affected. +_TEST_LIFECYCLE_HOOKS = frozenset(( + 'setUpClass', 'tearDownClass', + 'setUpModule', 'tearDownModule', +)) + +def normalize_test_name(test_full_name, *, is_error=False): + short_name = test_full_name.split(" ")[0] + if is_error and short_name in _TEST_LIFECYCLE_HOOKS: + if test_full_name.startswith(('setUpModule (', 'tearDownModule (')): + # if setUpModule() or tearDownModule() failed, don't filter + # tests with the test file name, don't use use filters. + return None + + # This means that we have a failure in a life-cycle hook, + # we need to rerun the whole module or class suite. + # Basically the error looks like this: + # ERROR: setUpClass (test.test_reg_ex.RegTest) + # or + # ERROR: setUpModule (test.test_reg_ex) + # So, we need to parse the class / module name. + lpar = test_full_name.index('(') + rpar = test_full_name.index(')') + return test_full_name[lpar + 1: rpar].split('.')[-1] + return short_name + + +def adjust_rlimit_nofile(): + """ + On macOS the default fd limit (RLIMIT_NOFILE) is sometimes too low (256) + for our test suite to succeed. Raise it to something more reasonable. 1024 + is a common Linux default. 
+ """ + try: + import resource + except ImportError: + return + + fd_limit, max_fds = resource.getrlimit(resource.RLIMIT_NOFILE) + + desired_fds = 1024 + + if fd_limit < desired_fds and fd_limit < max_fds: + new_fd_limit = min(desired_fds, max_fds) + try: + resource.setrlimit(resource.RLIMIT_NOFILE, + (new_fd_limit, max_fds)) + print(f"Raised RLIMIT_NOFILE: {fd_limit} -> {new_fd_limit}") + except (ValueError, OSError) as err: + print_warning(f"Unable to raise RLIMIT_NOFILE from {fd_limit} to " + f"{new_fd_limit}: {err}.") + + +def get_host_runner(): + if (hostrunner := os.environ.get("_PYTHON_HOSTRUNNER")) is None: + hostrunner = sysconfig.get_config_var("HOSTRUNNER") + return hostrunner + + +def is_cross_compiled(): + return ('_PYTHON_HOST_PLATFORM' in os.environ) + + +def format_resources(use_resources: tuple[str, ...]): + use_resources = set(use_resources) + all_resources = set(ALL_RESOURCES) + + # Express resources relative to "all" + relative_all = ['all'] + for name in sorted(all_resources - use_resources): + relative_all.append(f'-{name}') + for name in sorted(use_resources - all_resources): + relative_all.append(f'{name}') + all_text = ','.join(relative_all) + all_text = f"resources: {all_text}" + + # List of enabled resources + text = ','.join(sorted(use_resources)) + text = f"resources ({len(use_resources)}): {text}" + + # Pick the shortest string (prefer relative to all if lengths are equal) + if len(all_text) <= len(text): + return all_text + else: + return text + + +def process_cpu_count(): + if hasattr(os, 'sched_getaffinity'): + return len(os.sched_getaffinity(0)) + else: + return os.cpu_count() + + +def display_header(use_resources: tuple[str, ...], + python_cmd: tuple[str, ...] | None): + # Print basic platform information + print("==", platform.python_implementation(), *sys.version.split()) + print("==", platform.platform(aliased=True), + "%s-endian" % sys.byteorder) + print("== Python build:", ' '.join(get_build_info())) + print("== cwd:", os.getcwd()) + + cpu_count = os.cpu_count() + if cpu_count: + affinity = process_cpu_count() + if affinity and affinity != cpu_count: + cpu_count = f"{affinity} (process) / {cpu_count} (system)" + print("== CPU count:", cpu_count) + print("== encodings: locale=%s FS=%s" + % (locale.getencoding(), sys.getfilesystemencoding())) + + if use_resources: + text = format_resources(use_resources) + print(f"== {text}") + else: + print("== resources: all test resources are disabled, " + "use -u option to unskip tests") + + cross_compile = is_cross_compiled() + if cross_compile: + print("== cross compiled: Yes") + if python_cmd: + cmd = shlex.join(python_cmd) + print(f"== host python: {cmd}") + + get_cmd = [*python_cmd, '-m', 'platform'] + proc = subprocess.run( + get_cmd, + stdout=subprocess.PIPE, + text=True, + cwd=os_helper.SAVEDCWD) + stdout = proc.stdout.replace('\n', ' ').strip() + if stdout: + print(f"== host platform: {stdout}") + elif proc.returncode: + print(f"== host platform: ") + else: + hostrunner = get_host_runner() + if hostrunner: + print(f"== host runner: {hostrunner}") + + # This makes it easier to remember what to set in your local + # environment when trying to reproduce a sanitizer failure. 
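process_cpu_count() above prefers the scheduler affinity mask over the raw CPU count, so display_header() can report both figures when they differ. A tiny sketch of the same distinction; the output format is illustrative:

    import os

    system_cpus = os.cpu_count()
    if hasattr(os, "sched_getaffinity"):      # not available on all platforms
        usable_cpus = len(os.sched_getaffinity(0))
    else:
        usable_cpus = system_cpus

    if usable_cpus != system_cpus:
        print(f"== CPU count: {usable_cpus} (process) / {system_cpus} (system)")
    else:
        print(f"== CPU count: {system_cpus}")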
+ asan = support.check_sanitizer(address=True) + msan = support.check_sanitizer(memory=True) + ubsan = support.check_sanitizer(ub=True) + sanitizers = [] + if asan: + sanitizers.append("address") + if msan: + sanitizers.append("memory") + if ubsan: + sanitizers.append("undefined behavior") + if sanitizers: + print(f"== sanitizers: {', '.join(sanitizers)}") + for sanitizer, env_var in ( + (asan, "ASAN_OPTIONS"), + (msan, "MSAN_OPTIONS"), + (ubsan, "UBSAN_OPTIONS"), + ): + options= os.environ.get(env_var) + if sanitizer and options is not None: + print(f"== {env_var}={options!r}") + + print(flush=True) + + +def cleanup_temp_dir(tmp_dir: StrPath): + import glob + + path = os.path.join(glob.escape(tmp_dir), TMP_PREFIX + '*') + print("Cleanup %s directory" % tmp_dir) + for name in glob.glob(path): + if os.path.isdir(name): + print("Remove directory: %s" % name) + os_helper.rmtree(name) + else: + print("Remove file: %s" % name) + os_helper.unlink(name) + +WINDOWS_STATUS = { + 0xC0000005: "STATUS_ACCESS_VIOLATION", + 0xC00000FD: "STATUS_STACK_OVERFLOW", + 0xC000013A: "STATUS_CONTROL_C_EXIT", +} + +def get_signal_name(exitcode): + if exitcode < 0: + signum = -exitcode + try: + return signal.Signals(signum).name + except ValueError: + pass + + try: + return WINDOWS_STATUS[exitcode] + except KeyError: + pass + + return None diff --git a/Lib/test/libregrtest/worker.py b/Lib/test/libregrtest/worker.py new file mode 100644 index 00000000000000..a9c8be0bb65d08 --- /dev/null +++ b/Lib/test/libregrtest/worker.py @@ -0,0 +1,116 @@ +import subprocess +import sys +import os +from typing import Any, NoReturn + +from test import support +from test.support import os_helper + +from .setup import setup_process, setup_test_dir +from .runtests import RunTests, JsonFile, JsonFileType +from .single import run_single_test +from .utils import ( + StrPath, StrJSON, FilterTuple, + get_temp_dir, get_work_dir, exit_timeout) + + +USE_PROCESS_GROUP = (hasattr(os, "setsid") and hasattr(os, "killpg")) + + +def create_worker_process(runtests: RunTests, output_fd: int, + tmp_dir: StrPath | None = None) -> subprocess.Popen: + python_cmd = runtests.python_cmd + worker_json = runtests.as_json() + + python_opts = support.args_from_interpreter_flags() + if python_cmd is not None: + executable = python_cmd + # Remove -E option, since --python=COMMAND can set PYTHON environment + # variables, such as PYTHONPATH, in the worker process. + python_opts = [opt for opt in python_opts if opt != "-E"] + else: + executable = (sys.executable,) + cmd = [*executable, *python_opts, + '-u', # Unbuffered stdout and stderr + '-m', 'test.libregrtest.worker', + worker_json] + + env = dict(os.environ) + if tmp_dir is not None: + env['TMPDIR'] = tmp_dir + env['TEMP'] = tmp_dir + env['TMP'] = tmp_dir + + # Running the child from the same working directory as regrtest's original + # invocation ensures that TEMPDIR for the child is the same when + # sysconfig.is_python_build() is true. See issue 15300. + # + # Emscripten and WASI Python must start in the Python source code directory + # to get 'python.js' or 'python.wasm' file. Then worker_process() changes + # to a temporary directory created to run tests. 
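create_worker_process() below hands the JSON back-channel to the worker through JsonFile.configure_subprocess(); in the UNIX_FD case that amounts to listing an inherited file descriptor in Popen's pass_fds. A POSIX-only sketch of that mechanism, independent of the regrtest classes (the child code and payload are illustrative):

    import json
    import os
    import subprocess
    import sys

    # Parent: create a pipe and let the child inherit the write end.
    read_fd, write_fd = os.pipe()
    child_code = (
        "import json, os, sys; "
        "fd = int(sys.argv[1]); "
        "os.write(fd, json.dumps({'state': 'PASSED'}).encode()); "
        "os.close(fd)"
    )
    proc = subprocess.Popen(
        [sys.executable, "-c", child_code, str(write_fd)],
        pass_fds=[write_fd],    # keep this descriptor open in the child
    )
    os.close(write_fd)          # the parent's copy is no longer needed
    with os.fdopen(read_fd, encoding="utf-8") as fp:
        result = json.load(fp)
    proc.wait()
    print(result)               # -> {'state': 'PASSED'}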
+ work_dir = os_helper.SAVEDCWD + + kwargs: dict[str, Any] = dict( + env=env, + stdout=output_fd, + # bpo-45410: Write stderr into stdout to keep messages order + stderr=output_fd, + text=True, + close_fds=True, + cwd=work_dir, + ) + if USE_PROCESS_GROUP: + kwargs['start_new_session'] = True + + # Pass json_file to the worker process + json_file = runtests.json_file + json_file.configure_subprocess(kwargs) + + with json_file.inherit_subprocess(): + return subprocess.Popen(cmd, **kwargs) + + +def worker_process(worker_json: StrJSON) -> NoReturn: + runtests = RunTests.from_json(worker_json) + test_name = runtests.tests[0] + match_tests: FilterTuple | None = runtests.match_tests + json_file: JsonFile = runtests.json_file + + setup_test_dir(runtests.test_dir) + setup_process() + + if runtests.rerun: + if match_tests: + matching = "matching: " + ", ".join(match_tests) + print(f"Re-running {test_name} in verbose mode ({matching})", flush=True) + else: + print(f"Re-running {test_name} in verbose mode", flush=True) + + result = run_single_test(test_name, runtests) + + if json_file.file_type == JsonFileType.STDOUT: + print() + result.write_json_into(sys.stdout) + else: + with json_file.open('w', encoding='utf-8') as json_fp: + result.write_json_into(json_fp) + + sys.exit(0) + + +def main(): + if len(sys.argv) != 2: + print("usage: python -m test.libregrtest.worker JSON") + sys.exit(1) + worker_json = sys.argv[1] + + tmp_dir = get_temp_dir() + work_dir = get_work_dir(tmp_dir, worker=True) + + with exit_timeout(): + with os_helper.temp_cwd(work_dir, quiet=True): + worker_process(worker_json) + + +if __name__ == "__main__": + main() diff --git a/Lib/test/pythoninfo.py b/Lib/test/pythoninfo.py index 3dbcac2f86e81e..4f3ebb12ed957d 100644 --- a/Lib/test/pythoninfo.py +++ b/Lib/test/pythoninfo.py @@ -1,18 +1,13 @@ """ Collect various information about Python to help debugging test failures. 
""" -from __future__ import print_function import errno import re import sys import traceback -import unittest import warnings -MS_WINDOWS = (sys.platform == 'win32') - - def normalize_text(text): if text is None: return None @@ -244,6 +239,7 @@ def format_attr(attr, value): 'getresgid', 'getresuid', 'getuid', + 'process_cpu_count', 'uname', ): call_func(info_add, 'os.%s' % func, os, func) @@ -273,6 +269,7 @@ def format_groups(groups): "ARCHFLAGS", "ARFLAGS", "AUDIODEV", + "BUILDPYTHON", "CC", "CFLAGS", "COLUMNS", @@ -317,7 +314,6 @@ def format_groups(groups): "TEMP", "TERM", "TILE_LIBRARY", - "TIX_LIBRARY", "TMP", "TMPDIR", "TRAVIS", @@ -326,6 +322,7 @@ def format_groups(groups): "VIRTUAL_ENV", "WAYLAND_DISPLAY", "WINDIR", + "_PYTHON_HOSTRUNNER", "_PYTHON_HOST_PLATFORM", "_PYTHON_PROJECT_BASE", "_PYTHON_SYSCONFIGDATA_NAME", @@ -341,7 +338,8 @@ def format_groups(groups): for name, value in os.environ.items(): uname = name.upper() if (uname in ENV_VARS - # Copy PYTHON* and LC_* variables + # Copy PYTHON* variables like PYTHONPATH + # Copy LC_* variables like LC_ALL or uname.startswith(("PYTHON", "LC_")) # Visual Studio: VS140COMNTOOLS or (uname.startswith("VS") and uname.endswith("COMNTOOLS"))): @@ -494,13 +492,10 @@ def collect_datetime(info_add): def collect_sysconfig(info_add): - # On Windows, sysconfig is not reliable to get macros used - # to build Python - if MS_WINDOWS: - return - import sysconfig + info_add('sysconfig.is_python_build', sysconfig.is_python_build()) + for name in ( 'ABIFLAGS', 'ANDROID_API_LEVEL', @@ -509,6 +504,7 @@ def collect_sysconfig(info_add): 'CFLAGS', 'CFLAGSFORSHARED', 'CONFIG_ARGS', + 'HOSTRUNNER', 'HOST_GNU_TYPE', 'MACHDEP', 'MULTIARCH', @@ -634,7 +630,7 @@ def collect_sqlite(info_add): except ImportError: return - attributes = ('version', 'sqlite_version') + attributes = ('sqlite_version',) copy_attributes(info_add, sqlite3, 'sqlite3.%s', attributes) @@ -674,7 +670,29 @@ def collect_testcapi(info_add): except ImportError: return - call_func(info_add, 'pymem.allocator', _testcapi, 'pymem_getallocatorsname') + for name in ( + 'LONG_MAX', # always 32-bit on Windows, 64-bit on 64-bit Unix + 'PY_SSIZE_T_MAX', + 'Py_C_RECURSION_LIMIT', + 'SIZEOF_TIME_T', # 32-bit or 64-bit depending on the platform + 'SIZEOF_WCHAR_T', # 16-bit or 32-bit depending on the platform + ): + copy_attr(info_add, f'_testcapi.{name}', _testcapi, name) + + +def collect_testinternalcapi(info_add): + try: + import _testinternalcapi + except ImportError: + return + + call_func(info_add, 'pymem.allocator', _testinternalcapi, 'pymem_getallocatorsname') + + for name in ( + 'SIZEOF_PYGC_HEAD', + 'SIZEOF_PYOBJECT', + ): + copy_attr(info_add, f'_testinternalcapi.{name}', _testinternalcapi, name) def collect_resource(info_add): @@ -693,6 +711,7 @@ def collect_resource(info_add): def collect_test_socket(info_add): + import unittest try: from test import test_socket except (ImportError, unittest.SkipTest): @@ -704,15 +723,12 @@ def collect_test_socket(info_add): copy_attributes(info_add, test_socket, 'test_socket.%s', attributes) -def collect_test_support(info_add): +def collect_support(info_add): try: from test import support except ImportError: return - attributes = ('IPV6_ENABLED',) - copy_attributes(info_add, support, 'test_support.%s', attributes) - attributes = ( 'MS_WINDOWS', 'has_fork_support', @@ -726,17 +742,64 @@ def collect_test_support(info_add): ) copy_attributes(info_add, support, 'support.%s', attributes) - call_func(info_add, 'test_support._is_gui_available', support, 
'_is_gui_available') - call_func(info_add, 'test_support.python_is_optimized', support, 'python_is_optimized') + call_func(info_add, 'support._is_gui_available', support, '_is_gui_available') + call_func(info_add, 'support.python_is_optimized', support, 'python_is_optimized') - info_add('test_support.check_sanitizer(address=True)', + info_add('support.check_sanitizer(address=True)', support.check_sanitizer(address=True)) - info_add('test_support.check_sanitizer(memory=True)', + info_add('support.check_sanitizer(memory=True)', support.check_sanitizer(memory=True)) - info_add('test_support.check_sanitizer(ub=True)', + info_add('support.check_sanitizer(ub=True)', support.check_sanitizer(ub=True)) +def collect_support_os_helper(info_add): + try: + from test.support import os_helper + except ImportError: + return + + for name in ( + 'can_symlink', + 'can_xattr', + 'can_chmod', + 'can_dac_override', + ): + func = getattr(os_helper, name) + info_add(f'support_os_helper.{name}', func()) + + +def collect_support_socket_helper(info_add): + try: + from test.support import socket_helper + except ImportError: + return + + attributes = ( + 'IPV6_ENABLED', + 'has_gethostname', + ) + copy_attributes(info_add, socket_helper, 'support_socket_helper.%s', attributes) + + for name in ( + 'tcp_blackhole', + ): + func = getattr(socket_helper, name) + info_add(f'support_socket_helper.{name}', func()) + + +def collect_support_threading_helper(info_add): + try: + from test.support import threading_helper + except ImportError: + return + + attributes = ( + 'can_start_thread', + ) + copy_attributes(info_add, threading_helper, 'support_threading_helper.%s', attributes) + + def collect_cc(info_add): import subprocess import sysconfig @@ -891,6 +954,12 @@ def collect_fips(info_add): pass +def collect_tempfile(info_add): + import tempfile + + info_add('tempfile.gettempdir', tempfile.gettempdir()) + + def collect_libregrtest_utils(info_add): try: from test.libregrtest import utils @@ -933,6 +1002,8 @@ def collect_info(info): collect_sys, collect_sysconfig, collect_testcapi, + collect_testinternalcapi, + collect_tempfile, collect_time, collect_tkinter, collect_windows, @@ -941,7 +1012,10 @@ def collect_info(info): # Collecting from tests should be last as they have side effects. 
        collect_test_socket,
-        collect_test_support,
+        collect_support,
+        collect_support_os_helper,
+        collect_support_socket_helper,
+        collect_support_threading_helper,
     ):
         try:
             collect_func(info_add)
diff --git a/Lib/test/regrtest.py b/Lib/test/regrtest.py
index 0ffb3ed454eda0..46a74fe276f553 100755
--- a/Lib/test/regrtest.py
+++ b/Lib/test/regrtest.py
@@ -8,7 +8,7 @@
 
 import os
 import sys
-from test.libregrtest import main
+from test.libregrtest.main import main
 
 
 # Alias for backward compatibility (just in case)
diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py
index f5307b6442bf66..9066f0cd5b3819 100644
--- a/Lib/test/support/__init__.py
+++ b/Lib/test/support/__init__.py
@@ -431,6 +431,14 @@ def skip_if_sanitizer(reason=None, *, address=False, memory=False, ub=False):
 HAVE_ASAN_FORK_BUG = check_sanitizer(address=True)
 
 
+def set_sanitizer_env_var(env, option):
+    for name in ('ASAN_OPTIONS', 'MSAN_OPTIONS', 'UBSAN_OPTIONS'):
+        if name in env:
+            env[name] += f':{option}'
+        else:
+            env[name] = option
+
+
 def system_must_validate_cert(f):
     """Skip the test on TLS certificate validation failures."""
     @functools.wraps(f)
@@ -892,27 +900,31 @@ def inner(*args, **kwds):
 
 MAX_Py_ssize_t = sys.maxsize
 
-def set_memlimit(limit):
-    global max_memuse
-    global real_max_memuse
+def _parse_memlimit(limit: str) -> int:
     sizes = {
         'k': 1024,
         'm': _1M,
         'g': _1G,
         't': 1024*_1G,
     }
-    m = re.match(r'(\d+(\.\d+)?) (K|M|G|T)b?$', limit,
+    m = re.match(r'(\d+(?:\.\d+)?) (K|M|G|T)b?$', limit,
                  re.IGNORECASE | re.VERBOSE)
     if m is None:
-        raise ValueError('Invalid memory limit %r' % (limit,))
-    memlimit = int(float(m.group(1)) * sizes[m.group(3).lower()])
-    real_max_memuse = memlimit
-    if memlimit > MAX_Py_ssize_t:
-        memlimit = MAX_Py_ssize_t
+        raise ValueError(f'Invalid memory limit: {limit!r}')
+    return int(float(m.group(1)) * sizes[m.group(2).lower()])
+
+def set_memlimit(limit: str) -> None:
+    global max_memuse
+    global real_max_memuse
+    memlimit = _parse_memlimit(limit)
     if memlimit < _2G - 1:
-        raise ValueError('Memory limit %r too low to be useful' % (limit,))
+        raise ValueError(f'Memory limit {limit!r} too low to be useful')
+
+    real_max_memuse = memlimit
+    memlimit = min(memlimit, MAX_Py_ssize_t)
     max_memuse = memlimit
 
+
 class _MemoryWatchdog:
     """An object which periodically watches the process' memory
     consumption and prints it out.
@@ -1187,7 +1199,6 @@ def _is_full_match_test(pattern):
 
 def set_match_tests(accept_patterns=None, ignore_patterns=None):
     global _match_test_func, _accept_test_patterns, _ignore_test_patterns
-
     if accept_patterns is None:
         accept_patterns = ()
     if ignore_patterns is None:
@@ -2139,31 +2150,26 @@ def wait_process(pid, *, exitcode, timeout=None):
         if timeout is None:
             timeout = LONG_TIMEOUT
 
-        t0 = time.monotonic()
-        sleep = 0.001
-        max_sleep = 0.1
-        while True:
+
+        start_time = time.monotonic()
+        for _ in sleeping_retry(timeout, error=False):
             pid2, status = os.waitpid(pid, os.WNOHANG)
             if pid2 != 0:
                 break
-            # process is still running
-
-            dt = time.monotonic() - t0
-            if dt > timeout:
-                try:
-                    os.kill(pid, signal.SIGKILL)
-                    os.waitpid(pid, 0)
-                except OSError:
-                    # Ignore errors like ChildProcessError or PermissionError
-                    pass
-
-                raise AssertionError(f"process {pid} is still running "
-                                     f"after {dt:.1f} seconds")
+            # retry: the process is still running
+        else:
+            try:
+                os.kill(pid, signal.SIGKILL)
+                os.waitpid(pid, 0)
+            except OSError:
+                # Ignore errors like ChildProcessError or PermissionError
+                pass
 
-            sleep = min(sleep * 2, max_sleep)
-            time.sleep(sleep)
+            dt = time.monotonic() - start_time
+            raise AssertionError(f"process {pid} is still running "
+                                 f"after {dt:.1f} seconds")
     else:
-        # Windows implementation
+        # Windows implementation: don't support timeout :-(
         pid2, status = os.waitpid(pid, 0)
         exitcode2 = os.waitstatus_to_exitcode(status)
@@ -2327,6 +2333,11 @@ def requires_venv_with_pip():
     return unittest.skipUnless(ctypes, 'venv: pip requires ctypes')
 
 
+# True if Python is built with the Py_DEBUG macro defined: if
+# Python is built in debug mode (./configure --with-pydebug).
+Py_DEBUG = hasattr(sys, 'gettotalrefcount')
+
+
 def busy_retry(timeout, err_msg=None, /, *, error=True):
     """
     Run the loop body until "break" stops the loop.
diff --git a/Lib/test/support/os_helper.py b/Lib/test/support/os_helper.py
index f599cc752196e1..f77cce89a37d57 100644
--- a/Lib/test/support/os_helper.py
+++ b/Lib/test/support/os_helper.py
@@ -567,7 +567,7 @@ def fs_is_case_insensitive(directory):
 
 
 class FakePath:
-    """Simple implementing of the path protocol.
+    """Simple implementation of the path protocol.
""" def __init__(self, path): self.path = path diff --git a/Lib/test/support/testresult.py b/Lib/test/support/testresult.py index 2cd1366cd8a9e1..c5c297f1057b17 100644 --- a/Lib/test/support/testresult.py +++ b/Lib/test/support/testresult.py @@ -8,6 +8,7 @@ import time import traceback import unittest +from test import support class RegressionTestResult(unittest.TextTestResult): USE_XML = False @@ -109,6 +110,8 @@ def addExpectedFailure(self, test, err): def addFailure(self, test, err): self._add_result(test, True, failure=self.__makeErrorDict(*err)) super().addFailure(test, err) + if support.failfast: + self.stop() def addSkip(self, test, reason): self._add_result(test, skipped=reason) diff --git a/Lib/test/support/threading_helper.py b/Lib/test/support/threading_helper.py index 26cbc6f4d2439c..b9973c8bf5c914 100644 --- a/Lib/test/support/threading_helper.py +++ b/Lib/test/support/threading_helper.py @@ -88,19 +88,17 @@ def wait_threads_exit(timeout=None): yield finally: start_time = time.monotonic() - deadline = start_time + timeout - while True: + for _ in support.sleeping_retry(timeout, error=False): + support.gc_collect() count = _thread._count() if count <= old_count: break - if time.monotonic() > deadline: - dt = time.monotonic() - start_time - msg = (f"wait_threads() failed to cleanup {count - old_count} " - f"threads after {dt:.1f} seconds " - f"(count: {count}, old count: {old_count})") - raise AssertionError(msg) - time.sleep(0.010) - support.gc_collect() + else: + dt = time.monotonic() - start_time + msg = (f"wait_threads() failed to cleanup {count - old_count} " + f"threads after {dt:.1f} seconds " + f"(count: {count}, old count: {old_count})") + raise AssertionError(msg) def join_thread(thread, timeout=None): diff --git a/Lib/test/test_faulthandler.py b/Lib/test/test_faulthandler.py index 50b40d336e4dca..2d190647f2834b 100644 --- a/Lib/test/test_faulthandler.py +++ b/Lib/test/test_faulthandler.py @@ -8,7 +8,6 @@ import sys from test import support from test.support import os_helper, script_helper, is_android, MS_WINDOWS -from test.support import skip_if_sanitizer import tempfile import unittest from textwrap import dedent @@ -34,7 +33,7 @@ def expected_traceback(lineno1, lineno2, header, min_count=1): return '^' + regex + '$' def skip_segfault_on_android(test): - # Issue #32138: Raising SIGSEGV on Android may not cause a crash. + # gh-76319: Raising SIGSEGV on Android may not cause a crash. 
return unittest.skipIf(is_android, 'raising SIGSEGV on Android is unreliable')(test) @@ -62,8 +61,16 @@ def get_output(self, code, filename=None, fd=None): pass_fds = [] if fd is not None: pass_fds.append(fd) + env = dict(os.environ) + + # Sanitizers must not handle SIGSEGV (ex: for test_enable_fd()) + option = 'handle_segv=0' + support.set_sanitizer_env_var(env, option) + with support.SuppressCrashReport(): - process = script_helper.spawn_python('-c', code, pass_fds=pass_fds) + process = script_helper.spawn_python('-c', code, + pass_fds=pass_fds, + env=env) with process: output, stderr = process.communicate() exitcode = process.wait() @@ -302,8 +309,6 @@ def test_gil_released(self): 3, 'Segmentation fault') - @skip_if_sanitizer(memory=True, ub=True, reason="sanitizer " - "builds change crashing process output.") @skip_segfault_on_android def test_enable_file(self): with temporary_filename() as filename: @@ -319,8 +324,6 @@ def test_enable_file(self): @unittest.skipIf(sys.platform == "win32", "subprocess doesn't support pass_fds on Windows") - @skip_if_sanitizer(memory=True, ub=True, reason="sanitizer " - "builds change crashing process output.") @skip_segfault_on_android def test_enable_fd(self): with tempfile.TemporaryFile('wb+') as fp: diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 88c59b79dd5a91..45573a2719c543 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -5,28 +5,33 @@ """ import contextlib +import dataclasses import glob import io import locale import os.path import platform +import random import re +import shlex +import signal import subprocess import sys import sysconfig import tempfile import textwrap -import time import unittest -from test import libregrtest from test import support from test.support import os_helper, TestStats -from test.libregrtest import utils, setup +from test.libregrtest import cmdline +from test.libregrtest import main +from test.libregrtest import setup +from test.libregrtest import utils +from test.libregrtest.utils import normalize_test_name if not support.has_subprocess_support: raise unittest.SkipTest("test module requires subprocess") -Py_DEBUG = hasattr(sys, 'gettotalrefcount') ROOT_DIR = os.path.join(os.path.dirname(__file__), '..', '..') ROOT_DIR = os.path.abspath(os.path.normpath(ROOT_DIR)) LOG_PREFIX = r'[0-9]+:[0-9]+:[0-9]+ (?:load avg: [0-9]+\.[0-9]{2} )?' @@ -34,6 +39,7 @@ EXITCODE_BAD_TEST = 2 EXITCODE_ENV_CHANGED = 3 EXITCODE_NO_TESTS_RAN = 4 +EXITCODE_RERUN_FAIL = 5 EXITCODE_INTERRUPTED = 130 TEST_INTERRUPTED = textwrap.dedent(""" @@ -51,9 +57,13 @@ class ParseArgsTestCase(unittest.TestCase): Test regrtest's argument parsing, function _parse_args(). 
""" + @staticmethod + def parse_args(args): + return cmdline._parse_args(args) + def checkError(self, args, msg): with support.captured_stderr() as err, self.assertRaises(SystemExit): - libregrtest._parse_args(args) + self.parse_args(args) self.assertIn(msg, err.getvalue()) def test_help(self): @@ -61,83 +71,93 @@ def test_help(self): with self.subTest(opt=opt): with support.captured_stdout() as out, \ self.assertRaises(SystemExit): - libregrtest._parse_args([opt]) + self.parse_args([opt]) self.assertIn('Run Python regression tests.', out.getvalue()) def test_timeout(self): - ns = libregrtest._parse_args(['--timeout', '4.2']) + ns = self.parse_args(['--timeout', '4.2']) self.assertEqual(ns.timeout, 4.2) + + # negative, zero and empty string are treated as "no timeout" + for value in ('-1', '0', ''): + with self.subTest(value=value): + ns = self.parse_args([f'--timeout={value}']) + self.assertEqual(ns.timeout, None) + self.checkError(['--timeout'], 'expected one argument') - self.checkError(['--timeout', 'foo'], 'invalid float value') + self.checkError(['--timeout', 'foo'], 'invalid timeout value:') def test_wait(self): - ns = libregrtest._parse_args(['--wait']) + ns = self.parse_args(['--wait']) self.assertTrue(ns.wait) - def test_worker_args(self): - ns = libregrtest._parse_args(['--worker-args', '[[], {}]']) - self.assertEqual(ns.worker_args, '[[], {}]') - self.checkError(['--worker-args'], 'expected one argument') - def test_start(self): for opt in '-S', '--start': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, 'foo']) + ns = self.parse_args([opt, 'foo']) self.assertEqual(ns.start, 'foo') self.checkError([opt], 'expected one argument') def test_verbose(self): - ns = libregrtest._parse_args(['-v']) + ns = self.parse_args(['-v']) self.assertEqual(ns.verbose, 1) - ns = libregrtest._parse_args(['-vvv']) + ns = self.parse_args(['-vvv']) self.assertEqual(ns.verbose, 3) - ns = libregrtest._parse_args(['--verbose']) + ns = self.parse_args(['--verbose']) self.assertEqual(ns.verbose, 1) - ns = libregrtest._parse_args(['--verbose'] * 3) + ns = self.parse_args(['--verbose'] * 3) self.assertEqual(ns.verbose, 3) - ns = libregrtest._parse_args([]) + ns = self.parse_args([]) self.assertEqual(ns.verbose, 0) - def test_verbose2(self): - for opt in '-w', '--verbose2': + def test_rerun(self): + for opt in '-w', '--rerun', '--verbose2': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) - self.assertTrue(ns.verbose2) + ns = self.parse_args([opt]) + self.assertTrue(ns.rerun) def test_verbose3(self): for opt in '-W', '--verbose3': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.verbose3) def test_quiet(self): for opt in '-q', '--quiet': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.quiet) self.assertEqual(ns.verbose, 0) def test_slowest(self): for opt in '-o', '--slowest': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.print_slow) def test_header(self): - ns = libregrtest._parse_args(['--header']) + ns = self.parse_args(['--header']) self.assertTrue(ns.header) - ns = libregrtest._parse_args(['--verbose']) + ns = self.parse_args(['--verbose']) self.assertTrue(ns.header) def test_randomize(self): for opt in '-r', '--randomize': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.randomize) + with 
os_helper.EnvironmentVarGuard() as env: + env['SOURCE_DATE_EPOCH'] = '1' + + ns = self.parse_args(['--randomize']) + regrtest = main.Regrtest(ns) + self.assertFalse(regrtest.randomize) + self.assertIsNone(regrtest.random_seed) + def test_randseed(self): - ns = libregrtest._parse_args(['--randseed', '12345']) + ns = self.parse_args(['--randseed', '12345']) self.assertEqual(ns.random_seed, 12345) self.assertTrue(ns.randomize) self.checkError(['--randseed'], 'expected one argument') @@ -146,7 +166,7 @@ def test_randseed(self): def test_fromfile(self): for opt in '-f', '--fromfile': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, 'foo']) + ns = self.parse_args([opt, 'foo']) self.assertEqual(ns.fromfile, 'foo') self.checkError([opt], 'expected one argument') self.checkError([opt, 'foo', '-s'], "don't go together") @@ -154,20 +174,20 @@ def test_fromfile(self): def test_exclude(self): for opt in '-x', '--exclude': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.exclude) def test_single(self): for opt in '-s', '--single': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.single) self.checkError([opt, '-f', 'foo'], "don't go together") def test_ignore(self): for opt in '-i', '--ignore': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, 'pattern']) + ns = self.parse_args([opt, 'pattern']) self.assertEqual(ns.ignore_tests, ['pattern']) self.checkError([opt], 'expected one argument') @@ -177,7 +197,7 @@ def test_ignore(self): print('matchfile2', file=fp) filename = os.path.abspath(os_helper.TESTFN) - ns = libregrtest._parse_args(['-m', 'match', + ns = self.parse_args(['-m', 'match', '--ignorefile', filename]) self.assertEqual(ns.ignore_tests, ['matchfile1', 'matchfile2']) @@ -185,11 +205,11 @@ def test_ignore(self): def test_match(self): for opt in '-m', '--match': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, 'pattern']) + ns = self.parse_args([opt, 'pattern']) self.assertEqual(ns.match_tests, ['pattern']) self.checkError([opt], 'expected one argument') - ns = libregrtest._parse_args(['-m', 'pattern1', + ns = self.parse_args(['-m', 'pattern1', '-m', 'pattern2']) self.assertEqual(ns.match_tests, ['pattern1', 'pattern2']) @@ -199,7 +219,7 @@ def test_match(self): print('matchfile2', file=fp) filename = os.path.abspath(os_helper.TESTFN) - ns = libregrtest._parse_args(['-m', 'match', + ns = self.parse_args(['-m', 'match', '--matchfile', filename]) self.assertEqual(ns.match_tests, ['match', 'matchfile1', 'matchfile2']) @@ -207,65 +227,65 @@ def test_match(self): def test_failfast(self): for opt in '-G', '--failfast': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, '-v']) + ns = self.parse_args([opt, '-v']) self.assertTrue(ns.failfast) - ns = libregrtest._parse_args([opt, '-W']) + ns = self.parse_args([opt, '-W']) self.assertTrue(ns.failfast) self.checkError([opt], '-G/--failfast needs either -v or -W') def test_use(self): for opt in '-u', '--use': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, 'gui,network']) + ns = self.parse_args([opt, 'gui,network']) self.assertEqual(ns.use_resources, ['gui', 'network']) - ns = libregrtest._parse_args([opt, 'gui,none,network']) + ns = self.parse_args([opt, 'gui,none,network']) self.assertEqual(ns.use_resources, ['network']) - expected = list(libregrtest.ALL_RESOURCES) + expected = list(cmdline.ALL_RESOURCES) expected.remove('gui') - ns = 
libregrtest._parse_args([opt, 'all,-gui']) + ns = self.parse_args([opt, 'all,-gui']) self.assertEqual(ns.use_resources, expected) self.checkError([opt], 'expected one argument') self.checkError([opt, 'foo'], 'invalid resource') # all + a resource not part of "all" - ns = libregrtest._parse_args([opt, 'all,tzdata']) + ns = self.parse_args([opt, 'all,tzdata']) self.assertEqual(ns.use_resources, - list(libregrtest.ALL_RESOURCES) + ['tzdata']) + list(cmdline.ALL_RESOURCES) + ['tzdata']) # test another resource which is not part of "all" - ns = libregrtest._parse_args([opt, 'extralargefile']) + ns = self.parse_args([opt, 'extralargefile']) self.assertEqual(ns.use_resources, ['extralargefile']) def test_memlimit(self): for opt in '-M', '--memlimit': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, '4G']) + ns = self.parse_args([opt, '4G']) self.assertEqual(ns.memlimit, '4G') self.checkError([opt], 'expected one argument') def test_testdir(self): - ns = libregrtest._parse_args(['--testdir', 'foo']) + ns = self.parse_args(['--testdir', 'foo']) self.assertEqual(ns.testdir, os.path.join(os_helper.SAVEDCWD, 'foo')) self.checkError(['--testdir'], 'expected one argument') def test_runleaks(self): for opt in '-L', '--runleaks': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.runleaks) def test_huntrleaks(self): for opt in '-R', '--huntrleaks': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, ':']) + ns = self.parse_args([opt, ':']) self.assertEqual(ns.huntrleaks, (5, 4, 'reflog.txt')) - ns = libregrtest._parse_args([opt, '6:']) + ns = self.parse_args([opt, '6:']) self.assertEqual(ns.huntrleaks, (6, 4, 'reflog.txt')) - ns = libregrtest._parse_args([opt, ':3']) + ns = self.parse_args([opt, ':3']) self.assertEqual(ns.huntrleaks, (5, 3, 'reflog.txt')) - ns = libregrtest._parse_args([opt, '6:3:leaks.log']) + ns = self.parse_args([opt, '6:3:leaks.log']) self.assertEqual(ns.huntrleaks, (6, 3, 'leaks.log')) self.checkError([opt], 'expected one argument') self.checkError([opt, '6'], @@ -276,7 +296,7 @@ def test_huntrleaks(self): def test_multiprocess(self): for opt in '-j', '--multiprocess': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, '2']) + ns = self.parse_args([opt, '2']) self.assertEqual(ns.use_mp, 2) self.checkError([opt], 'expected one argument') self.checkError([opt, 'foo'], 'invalid int value') @@ -286,13 +306,13 @@ def test_multiprocess(self): def test_coverage(self): for opt in '-T', '--coverage': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.trace) def test_coverdir(self): for opt in '-D', '--coverdir': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, 'foo']) + ns = self.parse_args([opt, 'foo']) self.assertEqual(ns.coverdir, os.path.join(os_helper.SAVEDCWD, 'foo')) self.checkError([opt], 'expected one argument') @@ -300,13 +320,13 @@ def test_coverdir(self): def test_nocoverdir(self): for opt in '-N', '--nocoverdir': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertIsNone(ns.coverdir) def test_threshold(self): for opt in '-t', '--threshold': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt, '1000']) + ns = self.parse_args([opt, '1000']) self.assertEqual(ns.threshold, 1000) self.checkError([opt], 'expected one argument') self.checkError([opt, 'foo'], 'invalid int value') @@ -315,7 +335,7 @@ def test_nowindows(self): for opt in 
'-n', '--nowindows': with self.subTest(opt=opt): with contextlib.redirect_stderr(io.StringIO()) as stderr: - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.nowindows) err = stderr.getvalue() self.assertIn('the --nowindows (-n) option is deprecated', err) @@ -323,39 +343,39 @@ def test_nowindows(self): def test_forever(self): for opt in '-F', '--forever': with self.subTest(opt=opt): - ns = libregrtest._parse_args([opt]) + ns = self.parse_args([opt]) self.assertTrue(ns.forever) def test_unrecognized_argument(self): self.checkError(['--xxx'], 'usage:') def test_long_option__partial(self): - ns = libregrtest._parse_args(['--qui']) + ns = self.parse_args(['--qui']) self.assertTrue(ns.quiet) self.assertEqual(ns.verbose, 0) def test_two_options(self): - ns = libregrtest._parse_args(['--quiet', '--exclude']) + ns = self.parse_args(['--quiet', '--exclude']) self.assertTrue(ns.quiet) self.assertEqual(ns.verbose, 0) self.assertTrue(ns.exclude) def test_option_with_empty_string_value(self): - ns = libregrtest._parse_args(['--start', '']) + ns = self.parse_args(['--start', '']) self.assertEqual(ns.start, '') def test_arg(self): - ns = libregrtest._parse_args(['foo']) + ns = self.parse_args(['foo']) self.assertEqual(ns.args, ['foo']) def test_option_and_arg(self): - ns = libregrtest._parse_args(['--quiet', 'foo']) + ns = self.parse_args(['--quiet', 'foo']) self.assertTrue(ns.quiet) self.assertEqual(ns.verbose, 0) self.assertEqual(ns.args, ['foo']) def test_arg_option_arg(self): - ns = libregrtest._parse_args(['test_unaryop', '-v', 'test_binop']) + ns = self.parse_args(['test_unaryop', '-v', 'test_binop']) self.assertEqual(ns.verbose, 1) self.assertEqual(ns.args, ['test_unaryop', 'test_binop']) @@ -363,6 +383,62 @@ def test_unknown_option(self): self.checkError(['--unknown-option'], 'unrecognized arguments: --unknown-option') + def check_ci_mode(self, args, use_resources, rerun=True): + ns = cmdline._parse_args(args) + + # Check Regrtest attributes which are more reliable than Namespace + # which has an unclear API + regrtest = main.Regrtest(ns) + self.assertEqual(regrtest.num_workers, -1) + self.assertEqual(regrtest.want_rerun, rerun) + self.assertTrue(regrtest.randomize) + self.assertIsInstance(regrtest.random_seed, int) + self.assertTrue(regrtest.fail_env_changed) + self.assertTrue(regrtest.fail_rerun) + self.assertTrue(regrtest.print_slowest) + self.assertTrue(regrtest.output_on_failure) + self.assertEqual(sorted(regrtest.use_resources), sorted(use_resources)) + return regrtest + + def test_fast_ci(self): + args = ['--fast-ci'] + use_resources = sorted(cmdline.ALL_RESOURCES) + use_resources.remove('cpu') + regrtest = self.check_ci_mode(args, use_resources) + self.assertEqual(regrtest.timeout, 10 * 60) + + def test_fast_ci_python_cmd(self): + args = ['--fast-ci', '--python', 'python -X dev'] + use_resources = sorted(cmdline.ALL_RESOURCES) + use_resources.remove('cpu') + regrtest = self.check_ci_mode(args, use_resources, rerun=False) + self.assertEqual(regrtest.timeout, 10 * 60) + self.assertEqual(regrtest.python_cmd, ('python', '-X', 'dev')) + + def test_fast_ci_resource(self): + # it should be possible to override resources + args = ['--fast-ci', '-u', 'network'] + use_resources = ['network'] + self.check_ci_mode(args, use_resources) + + def test_slow_ci(self): + args = ['--slow-ci'] + use_resources = sorted(cmdline.ALL_RESOURCES) + regrtest = self.check_ci_mode(args, use_resources) + self.assertEqual(regrtest.timeout, 20 * 60) + + def 
test_dont_add_python_opts(self): + args = ['--dont-add-python-opts'] + ns = cmdline._parse_args(args) + self.assertFalse(ns._add_python_opts) + + +@dataclasses.dataclass(slots=True) +class Rerun: + name: str + match: str | None + success: bool + class BaseTestCase(unittest.TestCase): TEST_UNIQUE_ID = 1 @@ -411,10 +487,12 @@ def regex_search(self, regex, output): self.fail("%r not found in %r" % (regex, output)) return match - def check_line(self, output, regex, full=False): + def check_line(self, output, pattern, full=False, regex=True): + if not regex: + pattern = re.escape(pattern) if full: - regex += '\n' - regex = re.compile(r'^' + regex, re.MULTILINE) + pattern += '\n' + regex = re.compile(r'^' + pattern, re.MULTILINE) self.assertRegex(output, regex) def parse_executed_tests(self, output): @@ -423,13 +501,14 @@ def parse_executed_tests(self, output): parser = re.finditer(regex, output, re.MULTILINE) return list(match.group(1) for match in parser) - def check_executed_tests(self, output, tests, skipped=(), failed=(), + def check_executed_tests(self, output, tests, *, stats, + skipped=(), failed=(), env_changed=(), omitted=(), - rerun={}, run_no_tests=(), + rerun=None, run_no_tests=(), resource_denied=(), - randomize=False, interrupted=False, + randomize=False, parallel=False, interrupted=False, fail_env_changed=False, - *, stats): + forever=False, filtered=False): if isinstance(tests, str): tests = [tests] if isinstance(skipped, str): @@ -446,12 +525,23 @@ def check_executed_tests(self, output, tests, skipped=(), failed=(), run_no_tests = [run_no_tests] if isinstance(stats, int): stats = TestStats(stats) + if parallel: + randomize = True + + rerun_failed = [] + if rerun is not None and not env_changed: + failed = [rerun.name] + if not rerun.success: + rerun_failed.append(rerun.name) executed = self.parse_executed_tests(output) + total_tests = list(tests) + if rerun is not None: + total_tests.append(rerun.name) if randomize: - self.assertEqual(set(executed), set(tests), output) + self.assertEqual(set(executed), set(total_tests), output) else: - self.assertEqual(executed, tests, output) + self.assertEqual(executed, total_tests, output) def plural(count): return 's' if count != 1 else '' @@ -467,12 +557,17 @@ def list_regex(line_format, tests): regex = list_regex('%s test%s skipped', skipped) self.check_line(output, regex) + if resource_denied: + regex = list_regex(r'%s test%s skipped \(resource denied\)', resource_denied) + self.check_line(output, regex) + if failed: regex = list_regex('%s test%s failed', failed) self.check_line(output, regex) if env_changed: - regex = list_regex('%s test%s altered the execution environment', + regex = list_regex(r'%s test%s altered the execution environment ' + r'\(env changed\)', env_changed) self.check_line(output, regex) @@ -480,32 +575,36 @@ def list_regex(line_format, tests): regex = list_regex('%s test%s omitted', omitted) self.check_line(output, regex) - if rerun: - regex = list_regex('%s re-run test%s', rerun.keys()) + if rerun is not None: + regex = list_regex('%s re-run test%s', [rerun.name]) self.check_line(output, regex) - regex = LOG_PREFIX + r"Re-running failed tests in verbose mode" + regex = LOG_PREFIX + r"Re-running 1 failed tests in verbose mode" + self.check_line(output, regex) + regex = fr"Re-running {rerun.name} in verbose mode" + if rerun.match: + regex = fr"{regex} \(matching: {rerun.match}\)" self.check_line(output, regex) - for name, match in rerun.items(): - regex = LOG_PREFIX + f"Re-running {name} in verbose mode \\(matching: 
{match}\\)" - self.check_line(output, regex) if run_no_tests: regex = list_regex('%s test%s run no tests', run_no_tests) self.check_line(output, regex) - good = (len(tests) - len(skipped) - len(failed) + good = (len(tests) - len(skipped) - len(resource_denied) - len(failed) - len(omitted) - len(env_changed) - len(run_no_tests)) if good: - regex = r'%s test%s OK\.$' % (good, plural(good)) - if not skipped and not failed and good > 1: + regex = r'%s test%s OK\.' % (good, plural(good)) + if not skipped and not failed and (rerun is None or rerun.success) and good > 1: regex = 'All %s' % regex - self.check_line(output, regex) + self.check_line(output, regex, full=True) if interrupted: self.check_line(output, 'Test suite interrupted by signal SIGINT.') # Total tests - parts = [f'run={stats.tests_run:,}'] + text = f'run={stats.tests_run:,}' + if filtered: + text = fr'{text} \(filtered\)' + parts = [text] if stats.failures: parts.append(f'failures={stats.failures:,}') if stats.skipped: @@ -514,44 +613,57 @@ def list_regex(line_format, tests): self.check_line(output, line, full=True) # Total test files - report = [f'success={good}'] - if failed: - report.append(f'failed={len(failed)}') - if env_changed: - report.append(f'env_changed={len(env_changed)}') - if skipped: - report.append(f'skipped={len(skipped)}') - if resource_denied: - report.append(f'resource_denied={len(resource_denied)}') - if rerun: - report.append(f'rerun={len(rerun)}') - if run_no_tests: - report.append(f'run_no_tests={len(run_no_tests)}') + run = len(total_tests) - len(resource_denied) + if rerun is not None: + total_failed = len(rerun_failed) + total_rerun = 1 + else: + total_failed = len(failed) + total_rerun = 0 + if interrupted: + run = 0 + text = f'run={run}' + if not forever: + text = f'{text}/{len(tests)}' + if filtered: + text = fr'{text} \(filtered\)' + report = [text] + for name, ntest in ( + ('failed', total_failed), + ('env_changed', len(env_changed)), + ('skipped', len(skipped)), + ('resource_denied', len(resource_denied)), + ('rerun', total_rerun), + ('run_no_tests', len(run_no_tests)), + ): + if ntest: + report.append(f'{name}={ntest}') line = fr'Total test files: {" ".join(report)}' self.check_line(output, line, full=True) # Result - result = [] + state = [] if failed: - result.append('FAILURE') + state.append('FAILURE') elif fail_env_changed and env_changed: - result.append('ENV CHANGED') + state.append('ENV CHANGED') if interrupted: - result.append('INTERRUPTED') - if not any((good, result, failed, interrupted, skipped, + state.append('INTERRUPTED') + if not any((good, failed, interrupted, skipped, env_changed, fail_env_changed)): - result.append("NO TESTS RAN") - elif not result: - result.append('SUCCESS') - result = ', '.join(result) - if rerun: - result = 'FAILURE then %s' % result - self.check_line(output, f'Result: {result}', full=True) + state.append("NO TESTS RAN") + elif not state: + state.append('SUCCESS') + state = ', '.join(state) + if rerun is not None: + new_state = 'SUCCESS' if rerun.success else 'FAILURE' + state = f'{state} then {new_state}' + self.check_line(output, f'Result: {state}', full=True) def parse_random_seed(self, output): match = self.regex_search(r'Using random seed ([0-9]+)', output) randseed = int(match.group(1)) - self.assertTrue(0 <= randseed <= 10000000, randseed) + self.assertTrue(0 <= randseed, randseed) return randseed def run_command(self, args, input=None, exitcode=0, **kw): @@ -560,18 +672,18 @@ def run_command(self, args, input=None, exitcode=0, **kw): if 'stderr' not 
in kw: kw['stderr'] = subprocess.STDOUT proc = subprocess.run(args, - universal_newlines=True, + text=True, input=input, stdout=subprocess.PIPE, **kw) if proc.returncode != exitcode: - msg = ("Command %s failed with exit code %s\n" + msg = ("Command %s failed with exit code %s, but exit code %s expected!\n" "\n" "stdout:\n" "---\n" "%s\n" "---\n" - % (str(args), proc.returncode, proc.stdout)) + % (str(args), proc.returncode, exitcode, proc.stdout)) if proc.stderr: msg += ("\n" "stderr:\n" @@ -583,7 +695,11 @@ def run_command(self, args, input=None, exitcode=0, **kw): return proc def run_python(self, args, **kw): - args = [sys.executable, '-X', 'faulthandler', '-I', *args] + extraargs = [] + if 'uops' in sys._xoptions: + # Pass -X uops along + extraargs.extend(['-X', 'uops']) + args = [sys.executable, *extraargs, '-X', 'faulthandler', '-I', *args] proc = self.run_command(args, **kw) return proc.stdout @@ -638,8 +754,8 @@ def check_output(self, output): self.check_executed_tests(output, self.tests, randomize=True, stats=len(self.tests)) - def run_tests(self, args): - output = self.run_python(args) + def run_tests(self, args, env=None): + output = self.run_python(args, env=env) self.check_output(output) def test_script_regrtest(self): @@ -680,14 +796,6 @@ def test_script_autotest(self): args = [*self.python_args, script, *self.regrtest_args, *self.tests] self.run_tests(args) - @unittest.skipUnless(sysconfig.is_python_build(), - 'run_tests.py script is not installed') - def test_tools_script_run_tests(self): - # Tools/scripts/run_tests.py - script = os.path.join(ROOT_DIR, 'Tools', 'scripts', 'run_tests.py') - args = [script, *self.regrtest_args, *self.tests] - self.run_tests(args) - def run_batch(self, *args): proc = self.run_command(args) self.check_output(proc.stdout) @@ -705,7 +813,7 @@ def test_tools_buildbot_test(self): test_args.append('-arm32') # 32-bit ARM build elif platform.architecture()[0] == '64bit': test_args.append('-x64') # 64-bit build - if not Py_DEBUG: + if not support.Py_DEBUG: test_args.append('+d') # Release build, use python.exe self.run_batch(script, *test_args, *self.tests) @@ -722,7 +830,7 @@ def test_pcbuild_rt(self): rt_args.append('-arm32') # 32-bit ARM build elif platform.architecture()[0] == '64bit': rt_args.append('-x64') # 64-bit build - if Py_DEBUG: + if support.Py_DEBUG: rt_args.append('-d') # Debug build, use python_d.exe self.run_batch(script, *rt_args, *self.regrtest_args, *self.tests) @@ -736,6 +844,40 @@ def run_tests(self, *testargs, **kw): cmdargs = ['-m', 'test', '--testdir=%s' % self.tmptestdir, *testargs] return self.run_python(cmdargs, **kw) + def test_success(self): + code = textwrap.dedent(""" + import unittest + + class PassingTests(unittest.TestCase): + def test_test1(self): + pass + + def test_test2(self): + pass + + def test_test3(self): + pass + """) + tests = [self.create_test(f'ok{i}', code=code) for i in range(1, 6)] + + output = self.run_tests(*tests) + self.check_executed_tests(output, tests, + stats=3 * len(tests)) + + def test_skip(self): + code = textwrap.dedent(""" + import unittest + raise unittest.SkipTest("nope") + """) + test_ok = self.create_test('ok') + test_skip = self.create_test('skip', code=code) + tests = [test_ok, test_skip] + + output = self.run_tests(*tests) + self.check_executed_tests(output, tests, + skipped=[test_skip], + stats=1) + def test_failing_test(self): # test a failing test code = textwrap.dedent(""" @@ -775,14 +917,12 @@ def test_pass(self): # -u audio: 1 resource enabled output = self.run_tests('-uaudio', 
*test_names) self.check_executed_tests(output, test_names, - skipped=tests['network'], resource_denied=tests['network'], stats=1) # no option: 0 resources enabled - output = self.run_tests(*test_names) + output = self.run_tests(*test_names, exitcode=EXITCODE_NO_TESTS_RAN) self.check_executed_tests(output, test_names, - skipped=test_names, resource_denied=test_names, stats=0) @@ -810,6 +950,10 @@ def test_random(self): test_random2 = int(match.group(1)) self.assertEqual(test_random2, test_random) + # check that random.seed is used by default + output = self.run_tests(test, exitcode=EXITCODE_NO_TESTS_RAN) + self.assertIsInstance(self.parse_random_seed(output), int) + def test_fromfile(self): # test --fromfile tests = [self.create_test() for index in range(5)] @@ -928,16 +1072,32 @@ def test_run(self): builtins.__dict__['RUN'] = 1 """) test = self.create_test('forever', code=code) + + # --forever output = self.run_tests('--forever', test, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, [test]*3, failed=test, - stats=TestStats(1, 1)) - - def check_leak(self, code, what): + stats=TestStats(3, 1), + forever=True) + + # --forever --rerun + output = self.run_tests('--forever', '--rerun', test, exitcode=0) + self.check_executed_tests(output, [test]*3, + rerun=Rerun(test, + match='test_run', + success=True), + stats=TestStats(4, 1), + forever=True) + + def check_leak(self, code, what, *, run_workers=False): test = self.create_test('huntrleaks', code=code) filename = 'reflog.txt' self.addCleanup(os_helper.unlink, filename) - output = self.run_tests('--huntrleaks', '3:3:', test, + cmd = ['--huntrleaks', '3:3:'] + if run_workers: + cmd.append('-j1') + cmd.append(test) + output = self.run_tests(*cmd, exitcode=EXITCODE_BAD_TEST, stderr=subprocess.STDOUT) self.check_executed_tests(output, [test], failed=test, stats=1) @@ -952,8 +1112,8 @@ def check_leak(self, code, what): reflog = fp.read() self.assertIn(line2, reflog) - @unittest.skipUnless(Py_DEBUG, 'need a debug build') - def test_huntrleaks(self): + @unittest.skipUnless(support.Py_DEBUG, 'need a debug build') + def check_huntrleaks(self, *, run_workers: bool): # test --huntrleaks code = textwrap.dedent(""" import unittest @@ -964,9 +1124,15 @@ class RefLeakTest(unittest.TestCase): def test_leak(self): GLOBAL_LIST.append(object()) """) - self.check_leak(code, 'references') + self.check_leak(code, 'references', run_workers=run_workers) - @unittest.skipUnless(Py_DEBUG, 'need a debug build') + def test_huntrleaks(self): + self.check_huntrleaks(run_workers=False) + + def test_huntrleaks_mp(self): + self.check_huntrleaks(run_workers=True) + + @unittest.skipUnless(support.Py_DEBUG, 'need a debug build') def test_huntrleaks_fd_leak(self): # test --huntrleaks for file descriptor leak code = textwrap.dedent(""" @@ -1022,7 +1188,7 @@ def test_crashed(self): tests = [crash_test] output = self.run_tests("-j2", *tests, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, tests, failed=crash_test, - randomize=True, stats=0) + parallel=True, stats=0) def parse_methods(self, output): regex = re.compile("^(test[^ ]+).*ok$", flags=re.MULTILINE) @@ -1042,8 +1208,6 @@ def test_method3(self): def test_method4(self): pass """) - all_methods = ['test_method1', 'test_method2', - 'test_method3', 'test_method4'] testname = self.create_test(code=code) # only run a subset @@ -1126,6 +1290,15 @@ def test_env_changed(self): self.check_executed_tests(output, [testname], env_changed=testname, fail_env_changed=True, stats=1) + # rerun + output = 
self.run_tests("--rerun", testname) + self.check_executed_tests(output, [testname], + env_changed=testname, + rerun=Rerun(testname, + match=None, + success=True), + stats=2) + def test_rerun_fail(self): # FAILURE then FAILURE code = textwrap.dedent(""" @@ -1141,33 +1314,55 @@ def test_fail_always(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, [testname], - failed=testname, - rerun={testname: "test_fail_always"}, - stats=TestStats(1, 1)) + rerun=Rerun(testname, + "test_fail_always", + success=False), + stats=TestStats(3, 2)) def test_rerun_success(self): # FAILURE then SUCCESS - code = textwrap.dedent(""" - import builtins + marker_filename = os.path.abspath("regrtest_marker_filename") + self.addCleanup(os_helper.unlink, marker_filename) + self.assertFalse(os.path.exists(marker_filename)) + + code = textwrap.dedent(f""" + import os.path import unittest + marker_filename = {marker_filename!r} + class Tests(unittest.TestCase): def test_succeed(self): return def test_fail_once(self): - if not hasattr(builtins, '_test_failed'): - builtins._test_failed = True + if not os.path.exists(marker_filename): + open(marker_filename, "w").close() self.fail("bug") """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=0) + # FAILURE then SUCCESS => exit code 0 + output = self.run_tests("--rerun", testname, exitcode=0) self.check_executed_tests(output, [testname], - rerun={testname: "test_fail_once"}, - stats=1) + rerun=Rerun(testname, + match="test_fail_once", + success=True), + stats=TestStats(3, 1)) + os_helper.unlink(marker_filename) + + # with --fail-rerun, exit code EXITCODE_RERUN_FAIL + # on "FAILURE then SUCCESS" state. 
+ output = self.run_tests("--rerun", "--fail-rerun", testname, + exitcode=EXITCODE_RERUN_FAIL) + self.check_executed_tests(output, [testname], + rerun=Rerun(testname, + match="test_fail_once", + success=True), + stats=TestStats(3, 1)) + os_helper.unlink(marker_filename) def test_rerun_setup_class_hook_failure(self): # FAILURE then FAILURE @@ -1184,10 +1379,12 @@ def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, failed=[testname], - rerun={testname: "ExampleTests"}, + rerun=Rerun(testname, + match="ExampleTests", + success=False), stats=0) def test_rerun_teardown_class_hook_failure(self): @@ -1205,11 +1402,13 @@ def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, failed=[testname], - rerun={testname: "ExampleTests"}, - stats=1) + rerun=Rerun(testname, + match="ExampleTests", + success=False), + stats=2) def test_rerun_setup_module_hook_failure(self): # FAILURE then FAILURE @@ -1225,10 +1424,12 @@ def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, failed=[testname], - rerun={testname: testname}, + rerun=Rerun(testname, + match=None, + success=False), stats=0) def test_rerun_teardown_module_hook_failure(self): @@ -1245,11 +1446,13 @@ def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) - self.check_executed_tests(output, testname, + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) + self.check_executed_tests(output, [testname], failed=[testname], - rerun={testname: testname}, - stats=1) + rerun=Rerun(testname, + match=None, + success=False), + stats=2) def test_rerun_setup_hook_failure(self): # FAILURE then FAILURE @@ -1265,11 +1468,13 @@ def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, failed=[testname], - rerun={testname: "test_success"}, - stats=1) + rerun=Rerun(testname, + match="test_success", + success=False), + stats=2) def test_rerun_teardown_hook_failure(self): # FAILURE then FAILURE @@ -1285,11 +1490,13 @@ def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, failed=[testname], - rerun={testname: "test_success"}, - stats=1) + rerun=Rerun(testname, + match="test_success", + success=False), + stats=2) def test_rerun_async_setup_hook_failure(self): # FAILURE then FAILURE @@ -1305,11 +1512,12 @@ async def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, - 
failed=[testname], - rerun={testname: "test_success"}, - stats=1) + rerun=Rerun(testname, + match="test_success", + success=False), + stats=2) def test_rerun_async_teardown_hook_failure(self): # FAILURE then FAILURE @@ -1325,11 +1533,13 @@ async def test_success(self): """) testname = self.create_test(code=code) - output = self.run_tests("-w", testname, exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--rerun", testname, exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, testname, failed=[testname], - rerun={testname: "test_success"}, - stats=1) + rerun=Rerun(testname, + match="test_success", + success=False), + stats=2) def test_no_tests_ran(self): code = textwrap.dedent(""" @@ -1345,7 +1555,7 @@ def test_bug(self): exitcode=EXITCODE_NO_TESTS_RAN) self.check_executed_tests(output, [testname], run_no_tests=testname, - stats=0) + stats=0, filtered=True) def test_no_tests_ran_skip(self): code = textwrap.dedent(""" @@ -1376,7 +1586,7 @@ def test_bug(self): exitcode=EXITCODE_NO_TESTS_RAN) self.check_executed_tests(output, [testname, testname2], run_no_tests=[testname, testname2], - stats=0) + stats=0, filtered=True) def test_no_test_ran_some_test_exist_some_not(self): code = textwrap.dedent(""" @@ -1400,7 +1610,7 @@ def test_other_bug(self): "-m", "test_other_bug", exitcode=0) self.check_executed_tests(output, [testname, testname2], run_no_tests=[testname], - stats=1) + stats=1, filtered=True) @support.cpython_only def test_uncollectable(self): @@ -1611,16 +1821,15 @@ def test_leak_tmp_file(self): self.check_executed_tests(output, testnames, env_changed=testnames, fail_env_changed=True, - randomize=True, + parallel=True, stats=len(testnames)) for testname in testnames: self.assertIn(f"Warning -- {testname} leaked temporary " f"files (1): mytmpfile", output) - def test_mp_decode_error(self): - # gh-101634: If a worker stdout cannot be decoded, report a failed test - # and a non-zero exit code. + def test_worker_decode_error(self): + # gh-109425: Use "backslashreplace" error handler to decode stdout. if sys.platform == 'win32': encoding = locale.getencoding() else: @@ -1628,34 +1837,46 @@ def test_mp_decode_error(self): if encoding is None: encoding = sys.__stdout__.encoding if encoding is None: - self.skipTest(f"cannot get regrtest worker encoding") - - nonascii = b"byte:\xa0\xa9\xff\n" + self.skipTest("cannot get regrtest worker encoding") + + nonascii = bytes(ch for ch in range(128, 256)) + corrupted_output = b"nonascii:%s\n" % (nonascii,) + # gh-108989: On Windows, assertion errors are written in UTF-16: when + # decoded each letter is follow by a NUL character. 
+ assertion_failed = 'Assertion failed: tstate_is_alive(tstate)\n' + corrupted_output += assertion_failed.encode('utf-16-le') try: - nonascii.decode(encoding) + corrupted_output.decode(encoding) except UnicodeDecodeError: pass else: - self.skipTest(f"{encoding} can decode non-ASCII bytes {nonascii!a}") + self.skipTest(f"{encoding} can decode non-ASCII bytes") + + expected_line = corrupted_output.decode(encoding, 'backslashreplace') code = textwrap.dedent(fr""" import sys + import unittest + + class Tests(unittest.TestCase): + def test_pass(self): + pass + # bytes which cannot be decoded from UTF-8 - nonascii = {nonascii!a} - sys.stdout.buffer.write(nonascii) + corrupted_output = {corrupted_output!a} + sys.stdout.buffer.write(corrupted_output) sys.stdout.buffer.flush() """) testname = self.create_test(code=code) - output = self.run_tests("--fail-env-changed", "-v", "-j1", testname, - exitcode=EXITCODE_BAD_TEST) + output = self.run_tests("--fail-env-changed", "-v", "-j1", testname) self.check_executed_tests(output, [testname], - failed=[testname], - randomize=True, - stats=0) + parallel=True, + stats=1) + self.check_line(output, expected_line, regex=False) def test_doctest(self): - code = textwrap.dedent(fr''' + code = textwrap.dedent(r''' import doctest import sys from test import support @@ -1690,9 +1911,169 @@ def load_tests(loader, tests, pattern): exitcode=EXITCODE_BAD_TEST) self.check_executed_tests(output, [testname], failed=[testname], - randomize=True, + parallel=True, stats=TestStats(1, 1, 0)) + def _check_random_seed(self, run_workers: bool): + # gh-109276: When -r/--randomize is used, random.seed() is called + # with the same random seed before running each test file. + code = textwrap.dedent(r''' + import random + import unittest + + class RandomSeedTest(unittest.TestCase): + def test_randint(self): + numbers = [random.randint(0, 1000) for _ in range(10)] + print(f"Random numbers: {numbers}") + ''') + tests = [self.create_test(name=f'test_random{i}', code=code) + for i in range(1, 3+1)] + + random_seed = 856_656_202 + cmd = ["--randomize", f"--randseed={random_seed}"] + if run_workers: + # run as many worker processes than the number of tests + cmd.append(f'-j{len(tests)}') + cmd.extend(tests) + output = self.run_tests(*cmd) + + random.seed(random_seed) + # Make the assumption that nothing consume entropy between libregrest + # setup_tests() which calls random.seed() and RandomSeedTest calling + # random.randint(). 
+ numbers = [random.randint(0, 1000) for _ in range(10)] + expected = f"Random numbers: {numbers}" + + regex = r'^Random numbers: .*$' + matches = re.findall(regex, output, flags=re.MULTILINE) + self.assertEqual(matches, [expected] * len(tests)) + + def test_random_seed(self): + self._check_random_seed(run_workers=False) + + def test_random_seed_workers(self): + self._check_random_seed(run_workers=True) + + def test_python_command(self): + code = textwrap.dedent(r""" + import sys + import unittest + + class WorkerTests(unittest.TestCase): + def test_dev_mode(self): + self.assertTrue(sys.flags.dev_mode) + """) + tests = [self.create_test(code=code) for _ in range(3)] + + # Custom Python command: "python -X dev" + python_cmd = [sys.executable, '-X', 'dev'] + # test.libregrtest.cmdline uses shlex.split() to parse the Python + # command line string + python_cmd = shlex.join(python_cmd) + + output = self.run_tests("--python", python_cmd, "-j0", *tests) + self.check_executed_tests(output, tests, + stats=len(tests), parallel=True) + + def check_add_python_opts(self, option): + # --fast-ci and --slow-ci add "-u -W default -bb -E" options to Python + code = textwrap.dedent(r""" + import sys + import unittest + from test import support + try: + from _testinternalcapi import get_config + except ImportError: + get_config = None + + # WASI/WASM buildbots don't use -E option + use_environment = (support.is_emscripten or support.is_wasi) + + class WorkerTests(unittest.TestCase): + @unittest.skipUnless(get_config is None, 'need get_config()') + def test_config(self): + config = get_config()['config'] + # -u option + self.assertEqual(config['buffered_stdio'], 0) + # -W default option + self.assertTrue(config['warnoptions'], ['default']) + # -bb option + self.assertTrue(config['bytes_warning'], 2) + # -E option + self.assertTrue(config['use_environment'], use_environment) + + def test_python_opts(self): + # -u option + self.assertTrue(sys.__stdout__.write_through) + self.assertTrue(sys.__stderr__.write_through) + + # -W default option + self.assertTrue(sys.warnoptions, ['default']) + + # -bb option + self.assertEqual(sys.flags.bytes_warning, 2) + + # -E option + self.assertEqual(not sys.flags.ignore_environment, + use_environment) + """) + testname = self.create_test(code=code) + + # Use directly subprocess to control the exact command line + cmd = [sys.executable, + "-m", "test", option, + f'--testdir={self.tmptestdir}', + testname] + proc = subprocess.run(cmd, + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT, + text=True) + self.assertEqual(proc.returncode, 0, proc) + + def test_add_python_opts(self): + for opt in ("--fast-ci", "--slow-ci"): + with self.subTest(opt=opt): + self.check_add_python_opts(opt) + + # gh-76319: Raising SIGSEGV on Android may not cause a crash. 
+ @unittest.skipIf(support.is_android, + 'raising SIGSEGV on Android is unreliable') + def test_worker_output_on_failure(self): + try: + from faulthandler import _sigsegv + except ImportError: + self.skipTest("need faulthandler._sigsegv") + + code = textwrap.dedent(r""" + import faulthandler + import unittest + from test import support + + class CrashTests(unittest.TestCase): + def test_crash(self): + print("just before crash!", flush=True) + + with support.SuppressCrashReport(): + faulthandler._sigsegv(True) + """) + testname = self.create_test(code=code) + + # Sanitizers must not handle SIGSEGV (ex: for test_enable_fd()) + env = dict(os.environ) + option = 'handle_segv=0' + support.set_sanitizer_env_var(env, option) + + output = self.run_tests("-j1", testname, + exitcode=EXITCODE_BAD_TEST, + env=env) + self.check_executed_tests(output, testname, + failed=[testname], + stats=0, parallel=True) + if not support.MS_WINDOWS: + exitcode = -int(signal.SIGSEGV) + self.assertIn(f"Exit code {exitcode} (SIGSEGV)", output) + self.check_line(output, "just before crash!", full=True, regex=False) + class TestUtils(unittest.TestCase): def test_format_duration(self): @@ -1717,6 +2098,46 @@ def test_format_duration(self): self.assertEqual(utils.format_duration(3 * 3600 + 1), '3 hour 1 sec') + def test_normalize_test_name(self): + normalize = normalize_test_name + self.assertEqual(normalize('test_access (test.test_os.FileTests.test_access)'), + 'test_access') + self.assertEqual(normalize('setUpClass (test.test_os.ChownFileTests)', is_error=True), + 'ChownFileTests') + self.assertEqual(normalize('test_success (test.test_bug.ExampleTests.test_success)', is_error=True), + 'test_success') + self.assertIsNone(normalize('setUpModule (test.test_x)', is_error=True)) + self.assertIsNone(normalize('tearDownModule (test.test_module)', is_error=True)) + + def test_get_signal_name(self): + for exitcode, expected in ( + (-int(signal.SIGINT), 'SIGINT'), + (-int(signal.SIGSEGV), 'SIGSEGV'), + (3221225477, "STATUS_ACCESS_VIOLATION"), + (0xC00000FD, "STATUS_STACK_OVERFLOW"), + ): + self.assertEqual(utils.get_signal_name(exitcode), expected, exitcode) + + def test_format_resources(self): + format_resources = utils.format_resources + ALL_RESOURCES = utils.ALL_RESOURCES + self.assertEqual( + format_resources(("network",)), + 'resources (1): network') + self.assertEqual( + format_resources(("audio", "decimal", "network")), + 'resources (3): audio,decimal,network') + self.assertEqual( + format_resources(ALL_RESOURCES), + 'resources: all') + self.assertEqual( + format_resources(tuple(name for name in ALL_RESOURCES + if name != "cpu")), + 'resources: all,-cpu') + self.assertEqual( + format_resources((*ALL_RESOURCES, "tzdata")), + 'resources: all,tzdata') + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index fec96bef4bbd22..01ceeec3c0a73c 100644 --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -31,7 +31,7 @@ def setUpClass(cls): "test.support.warnings_helper", like=".*used in test_support.*" ) cls._test_support_token = support.ignore_deprecations_from( - "test.test_support", like=".*You should NOT be seeing this.*" + __name__, like=".*You should NOT be seeing this.*" ) assert len(warnings.filters) == orig_filter_len + 2 @@ -775,7 +775,45 @@ def recursive_function(depth): else: self.fail("RecursionError was not raised") - #self.assertEqual(available, 2) + def test_parse_memlimit(self): + parse = support._parse_memlimit + KiB = 1024 + MiB = KiB * 1024 + GiB 
= MiB * 1024 + TiB = GiB * 1024 + self.assertEqual(parse('0k'), 0) + self.assertEqual(parse('3k'), 3 * KiB) + self.assertEqual(parse('2.4m'), int(2.4 * MiB)) + self.assertEqual(parse('4g'), int(4 * GiB)) + self.assertEqual(parse('1t'), TiB) + + for limit in ('', '3', '3.5.10k', '10x'): + with self.subTest(limit=limit): + with self.assertRaises(ValueError): + parse(limit) + + def test_set_memlimit(self): + _4GiB = 4 * 1024 ** 3 + TiB = 1024 ** 4 + old_max_memuse = support.max_memuse + old_real_max_memuse = support.real_max_memuse + try: + if sys.maxsize > 2**32: + support.set_memlimit('4g') + self.assertEqual(support.max_memuse, _4GiB) + self.assertEqual(support.real_max_memuse, _4GiB) + + big = 2**100 // TiB + support.set_memlimit(f'{big}t') + self.assertEqual(support.max_memuse, sys.maxsize) + self.assertEqual(support.real_max_memuse, big * TiB) + else: + support.set_memlimit('4g') + self.assertEqual(support.max_memuse, sys.maxsize) + self.assertEqual(support.real_max_memuse, _4GiB) + finally: + support.max_memuse = old_max_memuse + support.real_max_memuse = old_real_max_memuse def test_copy_python_src_ignore(self): # Get source directory @@ -824,7 +862,6 @@ def test_copy_python_src_ignore(self): # EnvironmentVarGuard # transient_internet # run_with_locale - # set_memlimit # bigmemtest # precisionbigmemtest # bigaddrspacetest From b2a95f694a81d75fbdbc60c111c0557edd2ea81c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 13 Oct 2023 10:32:42 +0200 Subject: [PATCH 073/265] [3.11] gh-107450: Fix parser column offset overflow test on Windows (GH-110768) (#110809) (cherry picked from commit 05439d308740b621d03562451a7608eb725937ae) Co-authored-by: Lysandros Nikolaou Co-authored-by: Nikita Sobolev --- Lib/test/test_exceptions.py | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index 7f6eaa34e11406..c7c63e5e4ae29d 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -318,8 +318,10 @@ def baz(): check('(yield i) = 2', 1, 2) check('def f(*):\n pass', 1, 7) - def testMemoryErrorBigSource(self): - with self.assertRaisesRegex(OverflowError, "column offset overflow"): + @support.requires_resource('cpu') + @support.bigmemtest(support._2G, memuse=1.5) + def testMemoryErrorBigSource(self, _size): + with self.assertRaises(OverflowError): exec(f"if True:\n {' ' * 2**31}print('hello world')") @cpython_only From c0a77eb4425f0ef716393b11865cc2cb79c647c6 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 13 Oct 2023 15:24:29 +0200 Subject: [PATCH 074/265] [3.11] gh-110703: Add asyncio.wait_for() change notes for 3.11 (GH-110818) (#110827) gh-110703: Add asyncio.wait_for() change notes for 3.11 (GH-110818) * Remove redundant versionchanged * Add missing versionchanged * Update Doc/library/asyncio-task.rst --------- (cherry picked from commit f81e36f700ac8c6766207fcf3bc2540692af868b) Co-authored-by: paskozdilar <53006174+paskozdilar@users.noreply.github.com> Co-authored-by: Kumar Aditya --- Doc/library/asyncio-task.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/library/asyncio-task.rst b/Doc/library/asyncio-task.rst index ef17f384c1bfeb..0944aef0ab619d 100644 --- a/Doc/library/asyncio-task.rst +++ b/Doc/library/asyncio-task.rst @@ -714,9 +714,6 @@ Timeouts If the wait is cancelled, the future *aw* is also cancelled. - .. 
versionchanged:: 3.10 - Removed the *loop* parameter. - .. _asyncio_example_waitfor: Example:: @@ -747,6 +744,9 @@ Timeouts .. versionchanged:: 3.10 Removed the *loop* parameter. + .. versionchanged:: 3.11 + Raises :exc:`TimeoutError` instead of :exc:`asyncio.TimeoutError`. + Waiting Primitives ================== From d0ef4f83e8bb981467220047691a8bc52da8f3af Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 13 Oct 2023 15:59:25 +0200 Subject: [PATCH 075/265] [3.11] Bump sphinx-lint to 0.7.0 (GH-110830) (#110834) Bump sphinx-lint to 0.7.0 (GH-110830) (cherry picked from commit 0ed2329a1627fc8ae97b009114cd960c25567f75) Co-authored-by: Alex Waygood --- .pre-commit-config.yaml | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 1ae3dd71354e63..5d9deaa7ca8547 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -17,13 +17,12 @@ repos: types_or: [c, inc, python, rst] - repo: https://github.com/sphinx-contrib/sphinx-lint - rev: v0.6.8 + rev: v0.7.0 hooks: - id: sphinx-lint - args: [--enable=default-role] + args: [--enable=default-role, -j1] files: ^Doc/|^Misc/NEWS.d/next/ types: [rst] - require_serial: true - repo: meta hooks: From 2d615e1c260de8f3a414c20beecb4c5d238d1140 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 14 Oct 2023 08:28:52 +0200 Subject: [PATCH 076/265] [3.11] gh-101100: Fix sphinx warnings in `usage/cmdline.rst` (GH-110841) (#110856) Co-authored-by: Nikita Sobolev --- Doc/tools/.nitignore | 1 - Doc/using/cmdline.rst | 17 +++++++++-------- 2 files changed, 9 insertions(+), 9 deletions(-) diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 34975d5312f6e7..728d378a72ab5e 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -149,7 +149,6 @@ Doc/reference/import.rst Doc/reference/simple_stmts.rst Doc/tutorial/datastructures.rst Doc/tutorial/introduction.rst -Doc/using/cmdline.rst Doc/using/windows.rst Doc/whatsnew/2.0.rst Doc/whatsnew/2.1.rst diff --git a/Doc/using/cmdline.rst b/Doc/using/cmdline.rst index 79aefdaf389de5..c1a6f50ca34979 100644 --- a/Doc/using/cmdline.rst +++ b/Doc/using/cmdline.rst @@ -103,7 +103,7 @@ source. :option:`-I` option can be used to run the script in isolated mode where :data:`sys.path` contains neither the current directory nor the user's - site-packages directory. All :envvar:`PYTHON*` environment variables are + site-packages directory. All ``PYTHON*`` environment variables are ignored, too. Many standard library modules contain code that is invoked on their execution @@ -161,7 +161,7 @@ source. :option:`-I` option can be used to run the script in isolated mode where :data:`sys.path` contains neither the script's directory nor the user's - site-packages directory. All :envvar:`PYTHON*` environment variables are + site-packages directory. All ``PYTHON*`` environment variables are ignored, too. .. audit-event:: cpython.run_file filename @@ -277,7 +277,7 @@ Miscellaneous options .. option:: -E - Ignore all :envvar:`PYTHON*` environment variables, e.g. + Ignore all ``PYTHON*`` environment variables, e.g. :envvar:`PYTHONPATH` and :envvar:`PYTHONHOME`, that might be set. See also the :option:`-P` and :option:`-I` (isolated) options. @@ -300,7 +300,7 @@ Miscellaneous options and :option:`-s` options. In isolated mode :data:`sys.path` contains neither the script's directory nor - the user's site-packages directory. 
All :envvar:`PYTHON*` environment + the user's site-packages directory. All ``PYTHON*`` environment variables are ignored, too. Further restrictions may be imposed to prevent the user from injecting malicious code. @@ -359,7 +359,7 @@ Miscellaneous options randomization is enabled by default. On previous versions of Python, this option turns on hash randomization, - so that the :meth:`__hash__` values of str and bytes objects + so that the :meth:`~object.__hash__` values of str and bytes objects are "salted" with an unpredictable random value. Although they remain constant within an individual Python process, they are not predictable between repeated invocations of Python. @@ -837,9 +837,10 @@ conflict. If this environment variable is set to a non-empty string, :func:`faulthandler.enable` is called at startup: install a handler for - :const:`SIGSEGV`, :const:`SIGFPE`, :const:`SIGABRT`, :const:`SIGBUS` and - :const:`SIGILL` signals to dump the Python traceback. This is equivalent to - :option:`-X` ``faulthandler`` option. + :const:`~signal.SIGSEGV`, :const:`~signal.SIGFPE`, + :const:`~signal.SIGABRT`, :const:`~signal.SIGBUS` and + :const:`~signal.SIGILL` signals to dump the Python traceback. + This is equivalent to :option:`-X` ``faulthandler`` option. .. versionadded:: 3.3 From 035f9e0cc5ecda552a7326186cabbcd4dda5408d Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 14 Oct 2023 08:40:18 +0200 Subject: [PATCH 077/265] [3.11] gh-107705: Fix file leak in test_tkinter in the C locale (GH-110507) (GH-110858) (cherry picked from commit ca0f3d858d069231ce7c5b382790a774f385b467) Co-authored-by: Serhiy Storchaka --- Lib/tkinter/test/test_tkinter/test_images.py | 15 ++++++++++----- 1 file changed, 10 insertions(+), 5 deletions(-) diff --git a/Lib/tkinter/test/test_tkinter/test_images.py b/Lib/tkinter/test/test_tkinter/test_images.py index 5b3566a3611eb5..add9c8d6638e47 100644 --- a/Lib/tkinter/test/test_tkinter/test_images.py +++ b/Lib/tkinter/test/test_tkinter/test_images.py @@ -357,13 +357,18 @@ def test_get(self): self.assertRaises(tkinter.TclError, image.get, 15, 16) def test_write(self): + filename = os_helper.TESTFN + import locale + if locale.getlocale()[0] is None: + # Tcl uses Latin1 in the C locale + filename = os_helper.TESTFN_ASCII image = self.create() - self.addCleanup(os_helper.unlink, os_helper.TESTFN) + self.addCleanup(os_helper.unlink, filename) - image.write(os_helper.TESTFN) + image.write(filename) image2 = tkinter.PhotoImage('::img::test2', master=self.root, format='ppm', - file=os_helper.TESTFN) + file=filename) self.assertEqual(str(image2), '::img::test2') self.assertEqual(image2.type(), 'photo') self.assertEqual(image2.width(), 16) @@ -371,10 +376,10 @@ def test_write(self): self.assertEqual(image2.get(0, 0), image.get(0, 0)) self.assertEqual(image2.get(15, 8), image.get(15, 8)) - image.write(os_helper.TESTFN, format='gif', from_coords=(4, 6, 6, 9)) + image.write(filename, format='gif', from_coords=(4, 6, 6, 9)) image3 = tkinter.PhotoImage('::img::test3', master=self.root, format='gif', - file=os_helper.TESTFN) + file=filename) self.assertEqual(str(image3), '::img::test3') self.assertEqual(image3.type(), 'photo') self.assertEqual(image3.width(), 2) From 1c26f1ce6c3965b1f7395b60bf45d7fc0bc6de57 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 14 Oct 2023 08:51:24 +0200 Subject: [PATCH 078/265] [3.11] gh-109747: Improve errors for unsupported 
look-behind patterns (GH-109859) (GH-110860) Now re.error is raised instead of OverflowError or RuntimeError for too large width of look-behind pattern. The limit is increased to 2**32-1 (was 2**31-1). (cherry picked from commit e2b3d831fd2824d8a5713e3ed2a64aad0fb6b62d) Co-authored-by: Serhiy Storchaka --- Lib/re/_compiler.py | 4 +++- Lib/re/_parser.py | 13 ++++++++--- Lib/test/test_re.py | 23 +++++++++++++++++++ ...-09-25-20-05-41.gh-issue-109747._cRJH8.rst | 3 +++ Modules/_sre/sre.c | 2 -- Modules/_sre/sre_lib.h | 14 +++++------ 6 files changed, 46 insertions(+), 13 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst diff --git a/Lib/re/_compiler.py b/Lib/re/_compiler.py index d8e0d2fdefdcca..285c21936f2cfa 100644 --- a/Lib/re/_compiler.py +++ b/Lib/re/_compiler.py @@ -149,6 +149,8 @@ def _compile(code, pattern, flags): emit(0) # look ahead else: lo, hi = av[1].getwidth() + if lo > MAXCODE: + raise error("looks too much behind") if lo != hi: raise error("look-behind requires fixed-width pattern") emit(lo) # look behind @@ -549,7 +551,7 @@ def _compile_info(code, pattern, flags): else: emit(MAXCODE) prefix = prefix[:MAXCODE] - emit(min(hi, MAXCODE)) + emit(hi) # add literal prefix if prefix: emit(len(prefix)) # length diff --git a/Lib/re/_parser.py b/Lib/re/_parser.py index 6b715f54032f91..c795a7fb00fc97 100644 --- a/Lib/re/_parser.py +++ b/Lib/re/_parser.py @@ -68,6 +68,10 @@ TYPE_FLAGS = SRE_FLAG_ASCII | SRE_FLAG_LOCALE | SRE_FLAG_UNICODE GLOBAL_FLAGS = SRE_FLAG_DEBUG | SRE_FLAG_TEMPLATE +# Maximal value returned by SubPattern.getwidth(). +# Must be larger than MAXREPEAT, MAXCODE and sys.maxsize. +MAXWIDTH = 1 << 64 + class State: # keeps track of state for parsing def __init__(self): @@ -178,7 +182,7 @@ def getwidth(self): lo = hi = 0 for op, av in self.data: if op is BRANCH: - i = MAXREPEAT - 1 + i = MAXWIDTH j = 0 for av in av[1]: l, h = av.getwidth() @@ -197,7 +201,10 @@ def getwidth(self): elif op in _REPEATCODES: i, j = av[2].getwidth() lo = lo + i * av[0] - hi = hi + j * av[1] + if av[1] == MAXREPEAT and j: + hi = MAXWIDTH + else: + hi = hi + j * av[1] elif op in _UNITCODES: lo = lo + 1 hi = hi + 1 @@ -217,7 +224,7 @@ def getwidth(self): hi = hi + j elif op is SUCCESS: break - self.width = min(lo, MAXREPEAT - 1), min(hi, MAXREPEAT) + self.width = min(lo, MAXWIDTH), min(hi, MAXWIDTH) return self.width class Tokenizer: diff --git a/Lib/test/test_re.py b/Lib/test/test_re.py index b41bcd49385453..3c02d005b49a2a 100644 --- a/Lib/test/test_re.py +++ b/Lib/test/test_re.py @@ -1830,6 +1830,29 @@ def test_repeat_minmax_overflow(self): self.assertRaises(OverflowError, re.compile, r".{%d,}?" % 2**128) self.assertRaises(OverflowError, re.compile, r".{%d,%d}" % (2**129, 2**128)) + def test_look_behind_overflow(self): + string = "x" * 2_500_000 + p1 = r"(?<=((.{%d}){%d}){%d})" + p2 = r"(?)', diff --git a/Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst b/Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst new file mode 100644 index 00000000000000..b64ba627897a1a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst @@ -0,0 +1,3 @@ +Improve errors for unsupported look-behind patterns. Now re.error is raised +instead of OverflowError or RuntimeError for too large width of look-behind +pattern. 
diff --git a/Modules/_sre/sre.c b/Modules/_sre/sre.c index 81629f0007e6a1..3bad9d8d0ffa28 100644 --- a/Modules/_sre/sre.c +++ b/Modules/_sre/sre.c @@ -1939,8 +1939,6 @@ _validate_inner(SRE_CODE *code, SRE_CODE *end, Py_ssize_t groups) GET_SKIP; GET_ARG; /* 0 for lookahead, width for lookbehind */ code--; /* Back up over arg to simplify math below */ - if (arg & 0x80000000) - FAIL; /* Width too large */ /* Stop 1 before the end; we check the SUCCESS below */ if (_validate_inner(code+1, code+skip-2, groups)) FAIL; diff --git a/Modules/_sre/sre_lib.h b/Modules/_sre/sre_lib.h index f8d556b2db85c0..95c1ada908d222 100644 --- a/Modules/_sre/sre_lib.h +++ b/Modules/_sre/sre_lib.h @@ -589,8 +589,8 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) /* optimization info block */ /* <1=skip> <2=flags> <3=min> ... */ if (pattern[3] && (uintptr_t)(end - ptr) < pattern[3]) { - TRACE(("reject (got %zd chars, need %zd)\n", - end - ptr, (Py_ssize_t) pattern[3])); + TRACE(("reject (got %tu chars, need %zu)\n", + end - ptr, (size_t) pattern[3])); RETURN_FAILURE; } pattern += pattern[1] + 1; @@ -1507,7 +1507,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) /* */ TRACE(("|%p|%p|ASSERT %d\n", pattern, ptr, pattern[1])); - if (ptr - (SRE_CHAR *)state->beginning < (Py_ssize_t)pattern[1]) + if ((uintptr_t)(ptr - (SRE_CHAR *)state->beginning) < pattern[1]) RETURN_FAILURE; state->ptr = ptr - pattern[1]; DO_JUMP0(JUMP_ASSERT, jump_assert, pattern+2); @@ -1520,7 +1520,7 @@ SRE(match)(SRE_STATE* state, const SRE_CODE* pattern, int toplevel) /* */ TRACE(("|%p|%p|ASSERT_NOT %d\n", pattern, ptr, pattern[1])); - if (ptr - (SRE_CHAR *)state->beginning >= (Py_ssize_t)pattern[1]) { + if ((uintptr_t)(ptr - (SRE_CHAR *)state->beginning) >= pattern[1]) { state->ptr = ptr - pattern[1]; LASTMARK_SAVE(); if (state->repeat) @@ -1655,9 +1655,9 @@ SRE(search)(SRE_STATE* state, SRE_CODE* pattern) flags = pattern[2]; - if (pattern[3] && end - ptr < (Py_ssize_t)pattern[3]) { - TRACE(("reject (got %u chars, need %u)\n", - (unsigned int)(end - ptr), pattern[3])); + if (pattern[3] && (uintptr_t)(end - ptr) < pattern[3]) { + TRACE(("reject (got %tu chars, need %zu)\n", + end - ptr, (size_t) pattern[3])); return 0; } if (pattern[3] > 1) { From 44558a9ba8f5097f4c0ee0c3b28423c4850fb4f1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 14 Oct 2023 16:28:23 +0200 Subject: [PATCH 079/265] [3.11] gh-101100: Fix sphinx warnings in `library/time.rst` (GH-110862) (#110878) gh-101100: Fix sphinx warnings in `library/time.rst` (GH-110862) (cherry picked from commit 12deda763359d46d4eccbb8991afed71fa31a68b) Co-authored-by: Nikita Sobolev --- Doc/library/time.rst | 93 ++++++++++++++++++++++++++++---------------- Doc/tools/.nitignore | 1 - 2 files changed, 60 insertions(+), 34 deletions(-) diff --git a/Doc/library/time.rst b/Doc/library/time.rst index 9f23a6fc7d5341..93eceed29d2b5b 100644 --- a/Doc/library/time.rst +++ b/Doc/library/time.rst @@ -71,8 +71,8 @@ An explanation of some terminology and conventions is in order. 
* On the other hand, the precision of :func:`.time` and :func:`sleep` is better than their Unix equivalents: times are expressed as floating point numbers, :func:`.time` returns the most accurate time available (using Unix - :c:func:`gettimeofday` where available), and :func:`sleep` will accept a time - with a nonzero fraction (Unix :c:func:`select` is used to implement this, where + :c:func:`!gettimeofday` where available), and :func:`sleep` will accept a time + with a nonzero fraction (Unix :c:func:`!select` is used to implement this, where available). * The time value as returned by :func:`gmtime`, :func:`localtime`, and @@ -84,12 +84,14 @@ An explanation of some terminology and conventions is in order. See :class:`struct_time` for a description of these objects. .. versionchanged:: 3.3 - The :class:`struct_time` type was extended to provide the :attr:`tm_gmtoff` - and :attr:`tm_zone` attributes when platform supports corresponding + The :class:`struct_time` type was extended to provide + the :attr:`~struct_time.tm_gmtoff` and :attr:`~struct_time.tm_zone` + attributes when platform supports corresponding ``struct tm`` members. .. versionchanged:: 3.6 - The :class:`struct_time` attributes :attr:`tm_gmtoff` and :attr:`tm_zone` + The :class:`struct_time` attributes + :attr:`~struct_time.tm_gmtoff` and :attr:`~struct_time.tm_zone` are now available on all platforms. * Use the following functions to convert between time representations: @@ -496,6 +498,8 @@ Functions When used with the :func:`strptime` function, the ``%p`` directive only affects the output hour field if the ``%I`` directive is used to parse the hour. + .. _leap-second: + (2) The range really is ``0`` to ``61``; value ``60`` is valid in timestamps representing `leap seconds`_ and value ``61`` is supported @@ -566,32 +570,55 @@ Functions tuple` interface: values can be accessed by index and by attribute name. The following values are present: - +-------+-------------------+---------------------------------+ - | Index | Attribute | Values | - +=======+===================+=================================+ - | 0 | :attr:`tm_year` | (for example, 1993) | - +-------+-------------------+---------------------------------+ - | 1 | :attr:`tm_mon` | range [1, 12] | - +-------+-------------------+---------------------------------+ - | 2 | :attr:`tm_mday` | range [1, 31] | - +-------+-------------------+---------------------------------+ - | 3 | :attr:`tm_hour` | range [0, 23] | - +-------+-------------------+---------------------------------+ - | 4 | :attr:`tm_min` | range [0, 59] | - +-------+-------------------+---------------------------------+ - | 5 | :attr:`tm_sec` | range [0, 61]; see **(2)** in | - | | | :func:`strftime` description | - +-------+-------------------+---------------------------------+ - | 6 | :attr:`tm_wday` | range [0, 6], Monday is 0 | - +-------+-------------------+---------------------------------+ - | 7 | :attr:`tm_yday` | range [1, 366] | - +-------+-------------------+---------------------------------+ - | 8 | :attr:`tm_isdst` | 0, 1 or -1; see below | - +-------+-------------------+---------------------------------+ - | N/A | :attr:`tm_zone` | abbreviation of timezone name | - +-------+-------------------+---------------------------------+ - | N/A | :attr:`tm_gmtoff` | offset east of UTC in seconds | - +-------+-------------------+---------------------------------+ + .. list-table:: + + * - Index + - Attribute + - Values + + * - 0 + - .. attribute:: tm_year + - (for example, 1993) + + * - 1 + - .. 
attribute:: tm_mon + - range [1, 12] + + * - 2 + - .. attribute:: tm_day + - range [1, 31] + + * - 3 + - .. attribute:: tm_hour + - range [0, 23] + + * - 4 + - .. attribute:: tm_min + - range [0, 59] + + * - 5 + - .. attribute:: tm_sec + - range [0, 61]; see :ref:`Note (2) ` in :func:`strftime` + + * - 6 + - .. attribute:: tm_wday + - range [0, 6]; Monday is 0 + + * - 7 + - .. attribute:: tm_yday + - range [1, 366] + + * - 8 + - .. attribute:: tm_isdst + - 0, 1 or -1; see below + + * - N/A + - .. attribute:: tm_zone + - abbreviation of timezone name + + * - N/A + - .. attribute:: tm_gmtoff + - offset east of UTC in seconds Note that unlike the C structure, the month value is a range of [1, 12], not [0, 11]. @@ -912,8 +939,8 @@ Timezone Constants For the above Timezone constants (:data:`altzone`, :data:`daylight`, :data:`timezone`, and :data:`tzname`), the value is determined by the timezone rules in effect at module load time or the last time :func:`tzset` is called and may be incorrect - for times in the past. It is recommended to use the :attr:`tm_gmtoff` and - :attr:`tm_zone` results from :func:`localtime` to obtain timezone information. + for times in the past. It is recommended to use the :attr:`~struct_time.tm_gmtoff` and + :attr:`~struct_time.tm_zone` results from :func:`localtime` to obtain timezone information. .. seealso:: diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 728d378a72ab5e..0ff65ccb21d378 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -121,7 +121,6 @@ Doc/library/tarfile.rst Doc/library/tempfile.rst Doc/library/termios.rst Doc/library/test.rst -Doc/library/time.rst Doc/library/tkinter.rst Doc/library/tkinter.scrolledtext.rst Doc/library/tkinter.ttk.rst From 5c55f50a71008af5fe9ebf998722297a0591ed86 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Sun, 15 Oct 2023 11:32:26 +0300 Subject: [PATCH 080/265] [3.11] [3.12] bpo-42663: Fix parsing TZ strings in zoneinfo module (GH-23825) (GH-110882) (GH-110889) zipinfo now supports the full range of values in the TZ string determined by RFC 8536 and detects all invalid formats. Both Python and C implementations now raise exceptions of the same type on invalid data. 
(cherry picked from commit ab08ff7882b6181fb785eed7410dbf8030aded70) (cherry picked from commit 72b0f0eaf52ac46d5f6e165f737d6f891cf8d172) --- Lib/test/test_zoneinfo/test_zoneinfo.py | 125 +++++- Lib/zoneinfo/_zoneinfo.py | 86 ++-- ...2-05-06-15-49-57.gh-issue-86826.rf006W.rst | 4 + Modules/_zoneinfo.c | 371 +++++++----------- 4 files changed, 327 insertions(+), 259 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst diff --git a/Lib/test/test_zoneinfo/test_zoneinfo.py b/Lib/test/test_zoneinfo/test_zoneinfo.py index fc19247b262e01..0b6ab09eff357c 100644 --- a/Lib/test/test_zoneinfo/test_zoneinfo.py +++ b/Lib/test/test_zoneinfo/test_zoneinfo.py @@ -988,6 +988,80 @@ def test_tzstr_from_utc(self): self.assertEqual(dt_act, dt_utc) + def test_extreme_tzstr(self): + tzstrs = [ + # Extreme offset hour + "AAA24", + "AAA+24", + "AAA-24", + "AAA24BBB,J60/2,J300/2", + "AAA+24BBB,J60/2,J300/2", + "AAA-24BBB,J60/2,J300/2", + "AAA4BBB24,J60/2,J300/2", + "AAA4BBB+24,J60/2,J300/2", + "AAA4BBB-24,J60/2,J300/2", + # Extreme offset minutes + "AAA4:00BBB,J60/2,J300/2", + "AAA4:59BBB,J60/2,J300/2", + "AAA4BBB5:00,J60/2,J300/2", + "AAA4BBB5:59,J60/2,J300/2", + # Extreme offset seconds + "AAA4:00:00BBB,J60/2,J300/2", + "AAA4:00:59BBB,J60/2,J300/2", + "AAA4BBB5:00:00,J60/2,J300/2", + "AAA4BBB5:00:59,J60/2,J300/2", + # Extreme total offset + "AAA24:59:59BBB5,J60/2,J300/2", + "AAA-24:59:59BBB5,J60/2,J300/2", + "AAA4BBB24:59:59,J60/2,J300/2", + "AAA4BBB-24:59:59,J60/2,J300/2", + # Extreme months + "AAA4BBB,M12.1.1/2,M1.1.1/2", + "AAA4BBB,M1.1.1/2,M12.1.1/2", + # Extreme weeks + "AAA4BBB,M1.5.1/2,M1.1.1/2", + "AAA4BBB,M1.1.1/2,M1.5.1/2", + # Extreme weekday + "AAA4BBB,M1.1.6/2,M2.1.1/2", + "AAA4BBB,M1.1.1/2,M2.1.6/2", + # Extreme numeric offset + "AAA4BBB,0/2,20/2", + "AAA4BBB,0/2,0/14", + "AAA4BBB,20/2,365/2", + "AAA4BBB,365/2,365/14", + # Extreme julian offset + "AAA4BBB,J1/2,J20/2", + "AAA4BBB,J1/2,J1/14", + "AAA4BBB,J20/2,J365/2", + "AAA4BBB,J365/2,J365/14", + # Extreme transition hour + "AAA4BBB,J60/167,J300/2", + "AAA4BBB,J60/+167,J300/2", + "AAA4BBB,J60/-167,J300/2", + "AAA4BBB,J60/2,J300/167", + "AAA4BBB,J60/2,J300/+167", + "AAA4BBB,J60/2,J300/-167", + # Extreme transition minutes + "AAA4BBB,J60/2:00,J300/2", + "AAA4BBB,J60/2:59,J300/2", + "AAA4BBB,J60/2,J300/2:00", + "AAA4BBB,J60/2,J300/2:59", + # Extreme transition seconds + "AAA4BBB,J60/2:00:00,J300/2", + "AAA4BBB,J60/2:00:59,J300/2", + "AAA4BBB,J60/2,J300/2:00:00", + "AAA4BBB,J60/2,J300/2:00:59", + # Extreme total transition time + "AAA4BBB,J60/167:59:59,J300/2", + "AAA4BBB,J60/-167:59:59,J300/2", + "AAA4BBB,J60/2,J300/167:59:59", + "AAA4BBB,J60/2,J300/-167:59:59", + ] + + for tzstr in tzstrs: + with self.subTest(tzstr=tzstr): + self.zone_from_tzstr(tzstr) + def test_invalid_tzstr(self): invalid_tzstrs = [ "PST8PDT", # DST but no transition specified @@ -995,16 +1069,33 @@ def test_invalid_tzstr(self): "GMT,M3.2.0/2,M11.1.0/3", # Transition rule but no DST "GMT0+11,M3.2.0/2,M11.1.0/3", # Unquoted alphanumeric in DST "PST8PDT,M3.2.0/2", # Only one transition rule - # Invalid offsets - "STD+25", - "STD-25", - "STD+374", - "STD+374DST,M3.2.0/2,M11.1.0/3", - "STD+23DST+25,M3.2.0/2,M11.1.0/3", - "STD-23DST-25,M3.2.0/2,M11.1.0/3", + # Invalid offset hours + "AAA168", + "AAA+168", + "AAA-168", + "AAA168BBB,J60/2,J300/2", + "AAA+168BBB,J60/2,J300/2", + "AAA-168BBB,J60/2,J300/2", + "AAA4BBB168,J60/2,J300/2", + "AAA4BBB+168,J60/2,J300/2", + "AAA4BBB-168,J60/2,J300/2", + # Invalid offset minutes + 
"AAA4:0BBB,J60/2,J300/2", + "AAA4:100BBB,J60/2,J300/2", + "AAA4BBB5:0,J60/2,J300/2", + "AAA4BBB5:100,J60/2,J300/2", + # Invalid offset seconds + "AAA4:00:0BBB,J60/2,J300/2", + "AAA4:00:100BBB,J60/2,J300/2", + "AAA4BBB5:00:0,J60/2,J300/2", + "AAA4BBB5:00:100,J60/2,J300/2", # Completely invalid dates "AAA4BBB,M1443339,M11.1.0/3", "AAA4BBB,M3.2.0/2,0349309483959c", + "AAA4BBB,,J300/2", + "AAA4BBB,z,J300/2", + "AAA4BBB,J60/2,", + "AAA4BBB,J60/2,z", # Invalid months "AAA4BBB,M13.1.1/2,M1.1.1/2", "AAA4BBB,M1.1.1/2,M13.1.1/2", @@ -1024,6 +1115,26 @@ def test_invalid_tzstr(self): # Invalid julian offset "AAA4BBB,J0/2,J20/2", "AAA4BBB,J20/2,J366/2", + # Invalid transition time + "AAA4BBB,J60/2/3,J300/2", + "AAA4BBB,J60/2,J300/2/3", + # Invalid transition hour + "AAA4BBB,J60/168,J300/2", + "AAA4BBB,J60/+168,J300/2", + "AAA4BBB,J60/-168,J300/2", + "AAA4BBB,J60/2,J300/168", + "AAA4BBB,J60/2,J300/+168", + "AAA4BBB,J60/2,J300/-168", + # Invalid transition minutes + "AAA4BBB,J60/2:0,J300/2", + "AAA4BBB,J60/2:100,J300/2", + "AAA4BBB,J60/2,J300/2:0", + "AAA4BBB,J60/2,J300/2:100", + # Invalid transition seconds + "AAA4BBB,J60/2:00:0,J300/2", + "AAA4BBB,J60/2:00:100,J300/2", + "AAA4BBB,J60/2,J300/2:00:0", + "AAA4BBB,J60/2,J300/2:00:100", ] for invalid_tzstr in invalid_tzstrs: diff --git a/Lib/zoneinfo/_zoneinfo.py b/Lib/zoneinfo/_zoneinfo.py index eede15b8271058..b77dc0ed391bb3 100644 --- a/Lib/zoneinfo/_zoneinfo.py +++ b/Lib/zoneinfo/_zoneinfo.py @@ -517,8 +517,8 @@ class _DayOffset: __slots__ = ["d", "julian", "hour", "minute", "second"] def __init__(self, d, julian, hour=2, minute=0, second=0): - if not (0 + julian) <= d <= 365: - min_day = 0 + julian + min_day = 0 + julian # convert bool to int + if not min_day <= d <= 365: raise ValueError(f"d must be in [{min_day}, 365], not: {d}") self.d = d @@ -560,11 +560,11 @@ class _CalendarOffset: ) def __init__(self, m, w, d, hour=2, minute=0, second=0): - if not 0 < m <= 12: - raise ValueError("m must be in (0, 12]") + if not 1 <= m <= 12: + raise ValueError("m must be in [1, 12]") - if not 0 < w <= 5: - raise ValueError("w must be in (0, 5]") + if not 1 <= w <= 5: + raise ValueError("w must be in [1, 5]") if not 0 <= d <= 6: raise ValueError("d must be in [0, 6]") @@ -634,18 +634,21 @@ def _parse_tz_str(tz_str): offset_str, *start_end_str = tz_str.split(",", 1) - # fmt: off parser_re = re.compile( - r"(?P[^<0-9:.+-]+|<[a-zA-Z0-9+\-]+>)" + - r"((?P[+-]?\d{1,2}(:\d{2}(:\d{2})?)?)" + - r"((?P[^0-9:.+-]+|<[a-zA-Z0-9+\-]+>)" + - r"((?P[+-]?\d{1,2}(:\d{2}(:\d{2})?)?))?" + - r")?" + # dst - r")?$" # stdoff + r""" + (?P[^<0-9:.+-]+|<[a-zA-Z0-9+-]+>) + (?: + (?P[+-]?\d{1,3}(?::\d{2}(?::\d{2})?)?) + (?: + (?P[^0-9:.+-]+|<[a-zA-Z0-9+-]+>) + (?P[+-]?\d{1,3}(?::\d{2}(?::\d{2})?)?)? + )? # dst + )? 
# stdoff + """, + re.ASCII|re.VERBOSE ) - # fmt: on - m = parser_re.match(offset_str) + m = parser_re.fullmatch(offset_str) if m is None: raise ValueError(f"{tz_str} is not a valid TZ string") @@ -696,16 +699,17 @@ def _parse_tz_str(tz_str): def _parse_dst_start_end(dststr): - date, *time = dststr.split("/") - if date[0] == "M": + date, *time = dststr.split("/", 1) + type = date[:1] + if type == "M": n_is_julian = False - m = re.match(r"M(\d{1,2})\.(\d).(\d)$", date) + m = re.fullmatch(r"M(\d{1,2})\.(\d).(\d)", date, re.ASCII) if m is None: raise ValueError(f"Invalid dst start/end date: {dststr}") date_offset = tuple(map(int, m.groups())) offset = _CalendarOffset(*date_offset) else: - if date[0] == "J": + if type == "J": n_is_julian = True date = date[1:] else: @@ -715,38 +719,54 @@ def _parse_dst_start_end(dststr): offset = _DayOffset(doy, n_is_julian) if time: - time_components = list(map(int, time[0].split(":"))) - n_components = len(time_components) - if n_components < 3: - time_components.extend([0] * (3 - n_components)) - offset.hour, offset.minute, offset.second = time_components + offset.hour, offset.minute, offset.second = _parse_transition_time(time[0]) return offset +def _parse_transition_time(time_str): + match = re.fullmatch( + r"(?P[+-])?(?P\d{1,3})(:(?P\d{2})(:(?P\d{2}))?)?", + time_str, + re.ASCII + ) + if match is None: + raise ValueError(f"Invalid time: {time_str}") + + h, m, s = (int(v or 0) for v in match.group("h", "m", "s")) + + if h > 167: + raise ValueError( + f"Hour must be in [0, 167]: {time_str}" + ) + + if match.group("sign") == "-": + h, m, s = -h, -m, -s + + return h, m, s + + def _parse_tz_delta(tz_delta): - match = re.match( - r"(?P[+-])?(?P\d{1,2})(:(?P\d{2})(:(?P\d{2}))?)?", + match = re.fullmatch( + r"(?P[+-])?(?P\d{1,3})(:(?P\d{2})(:(?P\d{2}))?)?", tz_delta, + re.ASCII ) # Anything passed to this function should already have hit an equivalent # regular expression to find the section to parse. assert match is not None, tz_delta - h, m, s = ( - int(v) if v is not None else 0 - for v in map(match.group, ("h", "m", "s")) - ) + h, m, s = (int(v or 0) for v in match.group("h", "m", "s")) total = h * 3600 + m * 60 + s - if not -86400 < total < 86400: + if h > 24: raise ValueError( - f"Offset must be strictly between -24h and +24h: {tz_delta}" + f"Offset hours must be in [0, 24]: {tz_delta}" ) # Yes, +5 maps to an offset of -5h if match.group("sign") != "-": - total *= -1 + total = -total return total diff --git a/Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst b/Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst new file mode 100644 index 00000000000000..02cd75eec4be9e --- /dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst @@ -0,0 +1,4 @@ +:mod:`zipinfo` now supports the full range of values in the TZ string +determined by RFC 8536 and detects all invalid formats. +Both Python and C implementations now raise exceptions of the same +type on invalid data. 
diff --git a/Modules/_zoneinfo.c b/Modules/_zoneinfo.c index 5f584ff93e9a59..e2ed67c026c2a5 100644 --- a/Modules/_zoneinfo.c +++ b/Modules/_zoneinfo.c @@ -59,21 +59,21 @@ struct TransitionRuleType { typedef struct { TransitionRuleType base; - uint8_t month; - uint8_t week; - uint8_t day; - int8_t hour; - int8_t minute; - int8_t second; + uint8_t month; /* 1 - 12 */ + uint8_t week; /* 1 - 5 */ + uint8_t day; /* 0 - 6 */ + int16_t hour; /* -167 - 167, RFC 8536 §3.3.1 */ + int8_t minute; /* signed 2 digits */ + int8_t second; /* signed 2 digits */ } CalendarRule; typedef struct { TransitionRuleType base; - uint8_t julian; - unsigned int day; - int8_t hour; - int8_t minute; - int8_t second; + uint8_t julian; /* 0, 1 */ + uint16_t day; /* 0 - 365 */ + int16_t hour; /* -167 - 167, RFC 8536 §3.3.1 */ + int8_t minute; /* signed 2 digits */ + int8_t second; /* signed 2 digits */ } DayRule; struct StrongCacheNode { @@ -122,15 +122,14 @@ ts_to_local(size_t *trans_idx, int64_t *trans_utc, long *utcoff, static int parse_tz_str(PyObject *tz_str_obj, _tzrule *out); -static Py_ssize_t -parse_abbr(const char *const p, PyObject **abbr); -static Py_ssize_t -parse_tz_delta(const char *const p, long *total_seconds); -static Py_ssize_t -parse_transition_time(const char *const p, int8_t *hour, int8_t *minute, - int8_t *second); -static Py_ssize_t -parse_transition_rule(const char *const p, TransitionRuleType **out); +static int +parse_abbr(const char **p, PyObject **abbr); +static int +parse_tz_delta(const char **p, long *total_seconds); +static int +parse_transition_time(const char **p, int *hour, int *minute, int *second); +static int +parse_transition_rule(const char **p, TransitionRuleType **out); static _ttinfo * find_tzrule_ttinfo(_tzrule *rule, int64_t ts, unsigned char fold, int year); @@ -1212,14 +1211,14 @@ calendarrule_year_to_timestamp(TransitionRuleType *base_self, int year) } int64_t ordinal = ymd_to_ord(year, self->month, month_day) - EPOCHORDINAL; - return ((ordinal * 86400) + (int64_t)(self->hour * 3600) + - (int64_t)(self->minute * 60) + (int64_t)(self->second)); + return ordinal * 86400 + (int64_t)self->hour * 3600 + + (int64_t)self->minute * 60 + self->second; } /* Constructor for CalendarRule. */ int -calendarrule_new(uint8_t month, uint8_t week, uint8_t day, int8_t hour, - int8_t minute, int8_t second, CalendarRule *out) +calendarrule_new(int month, int week, int day, int hour, + int minute, int second, CalendarRule *out) { // These bounds come from the POSIX standard, which describes an Mm.n.d // rule as: @@ -1228,33 +1227,36 @@ calendarrule_new(uint8_t month, uint8_t week, uint8_t day, int8_t hour, // 5, 1 <= m <= 12, where week 5 means "the last d day in month m" which // may occur in either the fourth or the fifth week). Week 1 is the first // week in which the d'th day occurs. Day zero is Sunday. - if (month <= 0 || month > 12) { - PyErr_Format(PyExc_ValueError, "Month must be in (0, 12]"); + if (month < 1 || month > 12) { + PyErr_Format(PyExc_ValueError, "Month must be in [1, 12]"); return -1; } - if (week <= 0 || week > 5) { - PyErr_Format(PyExc_ValueError, "Week must be in (0, 5]"); + if (week < 1 || week > 5) { + PyErr_Format(PyExc_ValueError, "Week must be in [1, 5]"); return -1; } - // If the 'day' parameter type is changed to a signed type, - // "day < 0" check must be added. 
- if (/* day < 0 || */ day > 6) { + if (day < 0 || day > 6) { PyErr_Format(PyExc_ValueError, "Day must be in [0, 6]"); return -1; } + if (hour < -167 || hour > 167) { + PyErr_Format(PyExc_ValueError, "Hour must be in [0, 167]"); + return -1; + } + TransitionRuleType base = {&calendarrule_year_to_timestamp}; CalendarRule new_offset = { .base = base, - .month = month, - .week = week, - .day = day, - .hour = hour, - .minute = minute, - .second = second, + .month = (uint8_t)month, + .week = (uint8_t)week, + .day = (uint8_t)day, + .hour = (int16_t)hour, + .minute = (int8_t)minute, + .second = (int8_t)second, }; *out = new_offset; @@ -1294,40 +1296,45 @@ dayrule_year_to_timestamp(TransitionRuleType *base_self, int year) // always transitions on a given calendar day (other than February 29th), // you would use a Julian day, e.g. J91 always refers to April 1st and J365 // always refers to December 31st. - unsigned int day = self->day; + uint16_t day = self->day; if (self->julian && day >= 59 && is_leap_year(year)) { day += 1; } - return ((days_before_year + day) * 86400) + (self->hour * 3600) + - (self->minute * 60) + self->second; + return (days_before_year + day) * 86400 + (int64_t)self->hour * 3600 + + (int64_t)self->minute * 60 + self->second; } /* Constructor for DayRule. */ static int -dayrule_new(uint8_t julian, unsigned int day, int8_t hour, int8_t minute, - int8_t second, DayRule *out) +dayrule_new(int julian, int day, int hour, int minute, + int second, DayRule *out) { // The POSIX standard specifies that Julian days must be in the range (1 <= // n <= 365) and that non-Julian (they call it "0-based Julian") days must // be in the range (0 <= n <= 365). if (day < julian || day > 365) { - PyErr_Format(PyExc_ValueError, "day must be in [%u, 365], not: %u", + PyErr_Format(PyExc_ValueError, "day must be in [%d, 365], not: %d", julian, day); return -1; } + if (hour < -167 || hour > 167) { + PyErr_Format(PyExc_ValueError, "Hour must be in [0, 167]"); + return -1; + } + TransitionRuleType base = { &dayrule_year_to_timestamp, }; DayRule tmp = { .base = base, - .julian = julian, - .day = day, - .hour = hour, - .minute = minute, - .second = second, + .julian = (uint8_t)julian, + .day = (int16_t)day, + .hour = (int16_t)hour, + .minute = (int8_t)minute, + .second = (int8_t)second, }; *out = tmp; @@ -1484,21 +1491,18 @@ parse_tz_str(PyObject *tz_str_obj, _tzrule *out) const char *p = tz_str; // Read the `std` abbreviation, which must be at least 3 characters long. - Py_ssize_t num_chars = parse_abbr(p, &std_abbr); - if (num_chars < 1) { - PyErr_Format(PyExc_ValueError, "Invalid STD format in %R", tz_str_obj); + if (parse_abbr(&p, &std_abbr)) { + if (!PyErr_Occurred()) { + PyErr_Format(PyExc_ValueError, "Invalid STD format in %R", tz_str_obj); + } goto error; } - p += num_chars; - // Now read the STD offset, which is required - num_chars = parse_tz_delta(p, &std_offset); - if (num_chars < 0) { + if (parse_tz_delta(&p, &std_offset)) { PyErr_Format(PyExc_ValueError, "Invalid STD offset in %R", tz_str_obj); goto error; } - p += num_chars; // If the string ends here, there is no DST, otherwise we must parse the // DST abbreviation and start and end dates and times. 
@@ -1506,12 +1510,12 @@ parse_tz_str(PyObject *tz_str_obj, _tzrule *out) goto complete; } - num_chars = parse_abbr(p, &dst_abbr); - if (num_chars < 1) { - PyErr_Format(PyExc_ValueError, "Invalid DST format in %R", tz_str_obj); + if (parse_abbr(&p, &dst_abbr)) { + if (!PyErr_Occurred()) { + PyErr_Format(PyExc_ValueError, "Invalid DST format in %R", tz_str_obj); + } goto error; } - p += num_chars; if (*p == ',') { // From the POSIX standard: @@ -1521,14 +1525,11 @@ parse_tz_str(PyObject *tz_str_obj, _tzrule *out) dst_offset = std_offset + 3600; } else { - num_chars = parse_tz_delta(p, &dst_offset); - if (num_chars < 0) { + if (parse_tz_delta(&p, &dst_offset)) { PyErr_Format(PyExc_ValueError, "Invalid DST offset in %R", tz_str_obj); goto error; } - - p += num_chars; } TransitionRuleType **transitions[2] = {&start, &end}; @@ -1541,14 +1542,12 @@ parse_tz_str(PyObject *tz_str_obj, _tzrule *out) } p++; - num_chars = parse_transition_rule(p, transitions[i]); - if (num_chars < 0) { + if (parse_transition_rule(&p, transitions[i])) { PyErr_Format(PyExc_ValueError, "Malformed transition rule in TZ string: %R", tz_str_obj); goto error; } - p += num_chars; } if (*p != '\0') { @@ -1582,26 +1581,30 @@ parse_tz_str(PyObject *tz_str_obj, _tzrule *out) } static int -parse_uint(const char *const p, uint8_t *value) +parse_digits(const char **p, int min, int max, int *value) { - if (!isdigit(*p)) { - return -1; + assert(max <= 3); + *value = 0; + for (int i = 0; i < max; i++, (*p)++) { + if (!Py_ISDIGIT(**p)) { + return (i < min) ? -1 : 0; + } + *value *= 10; + *value += (**p) - '0'; } - - *value = (*p) - '0'; return 0; } /* Parse the STD and DST abbreviations from a TZ string. */ -static Py_ssize_t -parse_abbr(const char *const p, PyObject **abbr) +static int +parse_abbr(const char **p, PyObject **abbr) { - const char *ptr = p; - char buff = *ptr; + const char *ptr = *p; const char *str_start; const char *str_end; if (*ptr == '<') { + char buff; ptr++; str_start = ptr; while ((buff = *ptr) != '>') { @@ -1625,7 +1628,7 @@ parse_abbr(const char *const p, PyObject **abbr) ptr++; } else { - str_start = p; + str_start = ptr; // From the POSIX standard: // // In the unquoted form, all characters in these fields shall be @@ -1635,6 +1638,9 @@ parse_abbr(const char *const p, PyObject **abbr) ptr++; } str_end = ptr; + if (str_end == str_start) { + return -1; + } } *abbr = PyUnicode_FromStringAndSize(str_start, str_end - str_start); @@ -1642,12 +1648,13 @@ parse_abbr(const char *const p, PyObject **abbr) return -1; } - return ptr - p; + *p = ptr; + return 0; } /* Parse a UTC offset from a TZ str. */ -static Py_ssize_t -parse_tz_delta(const char *const p, long *total_seconds) +static int +parse_tz_delta(const char **p, long *total_seconds) { // From the POSIX spec: // @@ -1662,75 +1669,30 @@ parse_tz_delta(const char *const p, long *total_seconds) // The POSIX spec says that the values for `hour` must be between 0 and 24 // hours, but RFC 8536 §3.3.1 specifies that the hours part of the // transition times may be signed and range from -167 to 167. - long sign = -1; - long hours = 0; - long minutes = 0; - long seconds = 0; - - const char *ptr = p; - char buff = *ptr; - if (buff == '-' || buff == '+') { - // Negative numbers correspond to *positive* offsets, from the spec: - // - // If preceded by a '-', the timezone shall be east of the Prime - // Meridian; otherwise, it shall be west (which may be indicated by - // an optional preceding '+' ). 
- if (buff == '-') { - sign = 1; - } - - ptr++; - } - - // The hour can be 1 or 2 numeric characters - for (size_t i = 0; i < 2; ++i) { - buff = *ptr; - if (!isdigit(buff)) { - if (i == 0) { - return -1; - } - else { - break; - } - } + int hours = 0; + int minutes = 0; + int seconds = 0; - hours *= 10; - hours += buff - '0'; - ptr++; - } - - if (hours > 24 || hours < 0) { + if (parse_transition_time(p, &hours, &minutes, &seconds)) { return -1; } - // Minutes and seconds always of the format ":dd" - long *outputs[2] = {&minutes, &seconds}; - for (size_t i = 0; i < 2; ++i) { - if (*ptr != ':') { - goto complete; - } - ptr++; - - for (size_t j = 0; j < 2; ++j) { - buff = *ptr; - if (!isdigit(buff)) { - return -1; - } - *(outputs[i]) *= 10; - *(outputs[i]) += buff - '0'; - ptr++; - } + if (hours > 24 || hours < -24) { + return -1; } -complete: - *total_seconds = sign * ((hours * 3600) + (minutes * 60) + seconds); - - return ptr - p; + // Negative numbers correspond to *positive* offsets, from the spec: + // + // If preceded by a '-', the timezone shall be east of the Prime + // Meridian; otherwise, it shall be west (which may be indicated by + // an optional preceding '+' ). + *total_seconds = -((hours * 3600L) + (minutes * 60) + seconds); + return 0; } /* Parse the date portion of a transition rule. */ -static Py_ssize_t -parse_transition_rule(const char *const p, TransitionRuleType **out) +static int +parse_transition_rule(const char **p, TransitionRuleType **out) { // The full transition rule indicates when to change back and forth between // STD and DST, and has the form: @@ -1742,10 +1704,10 @@ parse_transition_rule(const char *const p, TransitionRuleType **out) // does not include the ',' at the end of the first rule. // // The POSIX spec states that if *time* is not given, the default is 02:00. - const char *ptr = p; - int8_t hour = 2; - int8_t minute = 0; - int8_t second = 0; + const char *ptr = *p; + int hour = 2; + int minute = 0; + int second = 0; // Rules come in one of three flavors: // @@ -1754,44 +1716,30 @@ parse_transition_rule(const char *const p, TransitionRuleType **out) // 3. Mm.n.d: Specifying by month, week and day-of-week. 
if (*ptr == 'M') { - uint8_t month, week, day; + int month, week, day; ptr++; - if (parse_uint(ptr, &month)) { + + if (parse_digits(&ptr, 1, 2, &month)) { return -1; } - ptr++; - if (*ptr != '.') { - uint8_t tmp; - if (parse_uint(ptr, &tmp)) { - return -1; - } - - month *= 10; - month += tmp; - ptr++; + if (*ptr++ != '.') { + return -1; } - - uint8_t *values[2] = {&week, &day}; - for (size_t i = 0; i < 2; ++i) { - if (*ptr != '.') { - return -1; - } - ptr++; - - if (parse_uint(ptr, values[i])) { - return -1; - } - ptr++; + if (parse_digits(&ptr, 1, 1, &week)) { + return -1; + } + if (*ptr++ != '.') { + return -1; + } + if (parse_digits(&ptr, 1, 1, &day)) { + return -1; } if (*ptr == '/') { ptr++; - Py_ssize_t num_chars = - parse_transition_time(ptr, &hour, &minute, &second); - if (num_chars < 0) { + if (parse_transition_time(&ptr, &hour, &minute, &second)) { return -1; } - ptr += num_chars; } CalendarRule *rv = PyMem_Calloc(1, sizeof(CalendarRule)); @@ -1807,33 +1755,22 @@ parse_transition_rule(const char *const p, TransitionRuleType **out) *out = (TransitionRuleType *)rv; } else { - uint8_t julian = 0; - unsigned int day = 0; + int julian = 0; + int day = 0; if (*ptr == 'J') { julian = 1; ptr++; } - for (size_t i = 0; i < 3; ++i) { - if (!isdigit(*ptr)) { - if (i == 0) { - return -1; - } - break; - } - day *= 10; - day += (*ptr) - '0'; - ptr++; + if (parse_digits(&ptr, 1, 3, &day)) { + return -1; } if (*ptr == '/') { ptr++; - Py_ssize_t num_chars = - parse_transition_time(ptr, &hour, &minute, &second); - if (num_chars < 0) { + if (parse_transition_time(&ptr, &hour, &minute, &second)) { return -1; } - ptr += num_chars; } DayRule *rv = PyMem_Calloc(1, sizeof(DayRule)); @@ -1848,13 +1785,13 @@ parse_transition_rule(const char *const p, TransitionRuleType **out) *out = (TransitionRuleType *)rv; } - return ptr - p; + *p = ptr; + return 0; } /* Parse the time portion of a transition rule (e.g. following an /) */ -static Py_ssize_t -parse_transition_time(const char *const p, int8_t *hour, int8_t *minute, - int8_t *second) +static int +parse_transition_time(const char **p, int *hour, int *minute, int *second) { // From the spec: // @@ -1866,12 +1803,9 @@ parse_transition_time(const char *const p, int8_t *hour, int8_t *minute, // h[h][:mm[:ss]] // // RFC 8536 also allows transition times to be signed and to range from - // -167 to +167, but the current version only supports [0, 99]. - // - // TODO: Support the full range of transition hours. - int8_t *components[3] = {hour, minute, second}; - const char *ptr = p; - int8_t sign = 1; + // -167 to +167. + const char *ptr = *p; + int sign = 1; if (*ptr == '-' || *ptr == '+') { if (*ptr == '-') { @@ -1880,32 +1814,31 @@ parse_transition_time(const char *const p, int8_t *hour, int8_t *minute, ptr++; } - for (size_t i = 0; i < 3; ++i) { - if (i > 0) { - if (*ptr != ':') { - break; - } - ptr++; + // The hour can be 1 to 3 numeric characters + if (parse_digits(&ptr, 1, 3, hour)) { + return -1; + } + *hour *= sign; + + // Minutes and seconds always of the format ":dd" + if (*ptr == ':') { + ptr++; + if (parse_digits(&ptr, 2, 2, minute)) { + return -1; } + *minute *= sign; - uint8_t buff = 0; - for (size_t j = 0; j < 2; j++) { - if (!isdigit(*ptr)) { - if (i == 0 && j > 0) { - break; - } + if (*ptr == ':') { + ptr++; + if (parse_digits(&ptr, 2, 2, second)) { return -1; } - - buff *= 10; - buff += (*ptr) - '0'; - ptr++; + *second *= sign; } - - *(components[i]) = sign * buff; } - return ptr - p; + *p = ptr; + return 0; } /* Constructor for a _tzrule. 
@@ -2260,8 +2193,8 @@ get_local_timestamp(PyObject *dt, int64_t *local_ts) } } - *local_ts = (int64_t)(ord - EPOCHORDINAL) * 86400 + - (int64_t)(hour * 3600 + minute * 60 + second); + *local_ts = (int64_t)(ord - EPOCHORDINAL) * 86400L + + (int64_t)(hour * 3600L + minute * 60 + second); return 0; } From fd1b314cd3fc58e9155306e2d907492ed272c567 Mon Sep 17 00:00:00 2001 From: Thomas Grainger Date: Sun, 15 Oct 2023 07:51:13 -0700 Subject: [PATCH 081/265] [3.11] remove redundant call to attach_loop in watcher (GH-110847) (#110870) (cherry picked from commit 596589104fe5a4d90cb145b2cc69b71cc9aa9f07) --- Lib/asyncio/unix_events.py | 2 -- 1 file changed, 2 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py index ac4519acc4307b..0d4ba72603e675 100644 --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -1443,8 +1443,6 @@ def _init_watcher(self): with events._lock: if self._watcher is None: # pragma: no branch self._watcher = ThreadedChildWatcher() - if threading.current_thread() is threading.main_thread(): - self._watcher.attach_loop(self._local._loop) def set_event_loop(self, loop): """Set the event loop. From e60abeea30c004cfe0dd619554e013ad76cf1dc2 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 15 Oct 2023 19:00:57 +0200 Subject: [PATCH 082/265] [3.11] gh-110886 Doc: add a link to BNF Wikipedia article (GH-110887) (#110901) Co-authored-by: partev Co-authored-by: Hugo van Kemenade --- Doc/reference/introduction.rst | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/Doc/reference/introduction.rst b/Doc/reference/introduction.rst index 81f0a5c5d43883..cf186705e6e987 100644 --- a/Doc/reference/introduction.rst +++ b/Doc/reference/introduction.rst @@ -90,7 +90,8 @@ Notation .. index:: BNF, grammar, syntax, notation -The descriptions of lexical analysis and syntax use a modified BNF grammar +The descriptions of lexical analysis and syntax use a modified +`Backus–Naur form (BNF) `_ grammar notation. This uses the following style of definition: .. productionlist:: notation From 616862d58e41da4493619d8dac8b3dc372a1d789 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 16 Oct 2023 15:15:07 +0200 Subject: [PATCH 083/265] [3.11] gh-110527: Improve `PySet_Clear` docs (GH-110528) (#110927) gh-110527: Improve `PySet_Clear` docs (GH-110528) (cherry picked from commit bfc1cd8145db00df23fbbd2ed95324bb96c0b25b) Co-authored-by: Nikita Sobolev --- Doc/c-api/set.rst | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/Doc/c-api/set.rst b/Doc/c-api/set.rst index 1e8a09509032f5..09c0fb6b9c5f23 100644 --- a/Doc/c-api/set.rst +++ b/Doc/c-api/set.rst @@ -163,4 +163,6 @@ subtypes but not for instances of :class:`frozenset` or its subtypes. .. c:function:: int PySet_Clear(PyObject *set) - Empty an existing set of all elements. + Empty an existing set of all elements. Return ``0`` on + success. Return ``-1`` and raise :exc:`SystemError` if *set* is not an instance of + :class:`set` or its subtype. 
From 6502a130e46a24267b9655e69e2300243c510dec Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 16 Oct 2023 16:55:52 +0200 Subject: [PATCH 084/265] [3.11] regrtest: Prepend 'use' options in --{fast,slow}-ci (GH-110363) (#110924) regrtest: Prepend 'use' options in --{fast,slow}-ci (GH-110363) This allows individual resources to be disabled without having to explicitly re-enable all others. (cherry picked from commit b75186f69edcf54615910a5cd707996144163ef7) Co-authored-by: Zachary Ware --- Lib/test/libregrtest/cmdline.py | 10 ++++++---- Lib/test/test_regrtest.py | 8 +++++--- 2 files changed, 11 insertions(+), 7 deletions(-) diff --git a/Lib/test/libregrtest/cmdline.py b/Lib/test/libregrtest/cmdline.py index dd4cd335bef7e3..b1b00893402d8e 100644 --- a/Lib/test/libregrtest/cmdline.py +++ b/Lib/test/libregrtest/cmdline.py @@ -418,14 +418,16 @@ def _parse_args(args, **kwargs): # --slow-ci has the priority if ns.slow_ci: # Similar to: -u "all" --timeout=1200 - if not ns.use: - ns.use = [['all']] + if ns.use is None: + ns.use = [] + ns.use.insert(0, ['all']) if ns.timeout is None: ns.timeout = 1200 # 20 minutes elif ns.fast_ci: # Similar to: -u "all,-cpu" --timeout=600 - if not ns.use: - ns.use = [['all', '-cpu']] + if ns.use is None: + ns.use = [] + ns.use.insert(0, ['all', '-cpu']) if ns.timeout is None: ns.timeout = 600 # 10 minutes diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 45573a2719c543..c8e182397c835e 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -416,9 +416,11 @@ def test_fast_ci_python_cmd(self): self.assertEqual(regrtest.python_cmd, ('python', '-X', 'dev')) def test_fast_ci_resource(self): - # it should be possible to override resources - args = ['--fast-ci', '-u', 'network'] - use_resources = ['network'] + # it should be possible to override resources individually + args = ['--fast-ci', '-u-network'] + use_resources = sorted(cmdline.ALL_RESOURCES) + use_resources.remove('cpu') + use_resources.remove('network') self.check_ci_mode(args, use_resources) def test_slow_ci(self): From e8eb2bf7884292907878dca05082f3db9c31791e Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 16 Oct 2023 21:06:34 +0200 Subject: [PATCH 085/265] [3.11] C-API docs: Clarify the size of arenas (GH-110895) (#110947) C-API docs: Clarify the size of arenas (GH-110895) Clarify the size of arenas From 3.10.0 alpha 7, the pymalloc allocator uses arenas with a fixed size of 1 MiB on 64-bit platforms instead of 256 KiB on 32-bit platforms. (cherry picked from commit f07ca27709855d4637b43bba23384cc795143ee3) Co-authored-by: Mienxiu <82512658+mienxiu@users.noreply.github.com> --- Doc/c-api/memory.rst | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/Doc/c-api/memory.rst b/Doc/c-api/memory.rst index 87b8603c4c0b10..27005f92dd43e5 100644 --- a/Doc/c-api/memory.rst +++ b/Doc/c-api/memory.rst @@ -620,7 +620,8 @@ The pymalloc allocator Python has a *pymalloc* allocator optimized for small objects (smaller or equal to 512 bytes) with a short lifetime. It uses memory mappings called "arenas" -with a fixed size of 256 KiB. It falls back to :c:func:`PyMem_RawMalloc` and +with a fixed size of either 256 KiB on 32-bit platforms or 1 MiB on 64-bit +platforms. It falls back to :c:func:`PyMem_RawMalloc` and :c:func:`PyMem_RawRealloc` for allocations larger than 512 bytes. 
*pymalloc* is the :ref:`default allocator ` of the From 5f7aba938cf5007b6f95472a3a54732dd99a2df8 Mon Sep 17 00:00:00 2001 From: Alex Waygood Date: Mon, 16 Oct 2023 21:17:59 +0100 Subject: [PATCH 086/265] [3.11] Enable ruff on several more files in `Lib/test` (#110929) (#110935) (cherry-picked from commit 02d26c4bef3ad0f9c97e47993a7fa67898842e5c) --- Lib/ctypes/test/test_arrays.py | 10 +++++----- Lib/ctypes/test/test_functions.py | 6 +++--- Lib/test/.ruff.toml | 3 --- Lib/test/test_genericclass.py | 4 ++-- Lib/test/test_keywordonlyarg.py | 2 +- Lib/test/test_subclassinit.py | 10 +++++----- 6 files changed, 16 insertions(+), 19 deletions(-) diff --git a/Lib/ctypes/test/test_arrays.py b/Lib/ctypes/test/test_arrays.py index 14603b7049c92c..a6538779611d77 100644 --- a/Lib/ctypes/test/test_arrays.py +++ b/Lib/ctypes/test/test_arrays.py @@ -178,10 +178,10 @@ def test_bad_subclass(self): class T(Array): pass with self.assertRaises(AttributeError): - class T(Array): + class T2(Array): _type_ = c_int with self.assertRaises(AttributeError): - class T(Array): + class T3(Array): _length_ = 13 def test_bad_length(self): @@ -190,15 +190,15 @@ class T(Array): _type_ = c_int _length_ = - sys.maxsize * 2 with self.assertRaises(ValueError): - class T(Array): + class T2(Array): _type_ = c_int _length_ = -1 with self.assertRaises(TypeError): - class T(Array): + class T3(Array): _type_ = c_int _length_ = 1.87 with self.assertRaises(OverflowError): - class T(Array): + class T4(Array): _type_ = c_int _length_ = sys.maxsize * 2 diff --git a/Lib/ctypes/test/test_functions.py b/Lib/ctypes/test/test_functions.py index fc571700ce3be3..d1fbc32a41ea2c 100644 --- a/Lib/ctypes/test/test_functions.py +++ b/Lib/ctypes/test/test_functions.py @@ -42,16 +42,16 @@ class X(object, Array): from _ctypes import _Pointer with self.assertRaises(TypeError): - class X(object, _Pointer): + class X2(object, _Pointer): pass from _ctypes import _SimpleCData with self.assertRaises(TypeError): - class X(object, _SimpleCData): + class X3(object, _SimpleCData): _type_ = "i" with self.assertRaises(TypeError): - class X(object, Structure): + class X4(object, Structure): _fields_ = [] @need_symbol('c_wchar') diff --git a/Lib/test/.ruff.toml b/Lib/test/.ruff.toml index 7ef54160ec6a81..89e33b2d63c8f6 100644 --- a/Lib/test/.ruff.toml +++ b/Lib/test/.ruff.toml @@ -15,12 +15,9 @@ extend-exclude = [ "test_descr.py", "test_enum.py", "test_functools.py", - "test_genericclass.py", "test_grammar.py", "test_import/__init__.py", - "test_keywordonlyarg.py", "test_pkg.py", - "test_subclassinit.py", "test_tokenize.py", "test_yield_from.py", "time_hashlib.py", diff --git a/Lib/test/test_genericclass.py b/Lib/test/test_genericclass.py index d8bb37f69e18a1..aece757fc1933e 100644 --- a/Lib/test/test_genericclass.py +++ b/Lib/test/test_genericclass.py @@ -98,7 +98,7 @@ def __mro_entries__(self): return () d = C_too_few() with self.assertRaises(TypeError): - class D(d): ... + class E(d): ... def test_mro_entry_errors_2(self): class C_not_callable: @@ -111,7 +111,7 @@ def __mro_entries__(self): return object c = C_not_tuple() with self.assertRaises(TypeError): - class D(c): ... + class E(c): ... 
def test_mro_entry_metaclass(self): meta_args = [] diff --git a/Lib/test/test_keywordonlyarg.py b/Lib/test/test_keywordonlyarg.py index df82f677a00a48..918f953cae5702 100644 --- a/Lib/test/test_keywordonlyarg.py +++ b/Lib/test/test_keywordonlyarg.py @@ -170,7 +170,7 @@ def f(v=a, x=b, *, y=c, z=d): pass self.assertEqual(str(err.exception), "name 'b' is not defined") with self.assertRaises(NameError) as err: - f = lambda v=a, x=b, *, y=c, z=d: None + g = lambda v=a, x=b, *, y=c, z=d: None self.assertEqual(str(err.exception), "name 'b' is not defined") diff --git a/Lib/test/test_subclassinit.py b/Lib/test/test_subclassinit.py index 0ad7d17fbd4ddd..de3e4bb958a570 100644 --- a/Lib/test/test_subclassinit.py +++ b/Lib/test/test_subclassinit.py @@ -232,7 +232,7 @@ def __init__(self, name, bases, namespace, otherarg): super().__init__(name, bases, namespace) with self.assertRaises(TypeError): - class MyClass(metaclass=MyMeta, otherarg=1): + class MyClass2(metaclass=MyMeta, otherarg=1): pass class MyMeta(type): @@ -243,10 +243,10 @@ def __init__(self, name, bases, namespace, otherarg): super().__init__(name, bases, namespace) self.otherarg = otherarg - class MyClass(metaclass=MyMeta, otherarg=1): + class MyClass3(metaclass=MyMeta, otherarg=1): pass - self.assertEqual(MyClass.otherarg, 1) + self.assertEqual(MyClass3.otherarg, 1) def test_errors_changed_pep487(self): # These tests failed before Python 3.6, PEP 487 @@ -265,10 +265,10 @@ def __new__(cls, name, bases, namespace, otherarg): self.otherarg = otherarg return self - class MyClass(metaclass=MyMeta, otherarg=1): + class MyClass2(metaclass=MyMeta, otherarg=1): pass - self.assertEqual(MyClass.otherarg, 1) + self.assertEqual(MyClass2.otherarg, 1) def test_type(self): t = type('NewClass', (object,), {}) From 57ae64f76693b905f5fa72eaf320ad28c3a30440 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 17 Oct 2023 07:29:46 +0200 Subject: [PATCH 087/265] [3.11] Bump sphinx-lint to v0.8.1 (GH-110933) (#110958) Co-authored-by: Alex Waygood --- .pre-commit-config.yaml | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 5d9deaa7ca8547..d69365f85c0507 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -17,12 +17,11 @@ repos: types_or: [c, inc, python, rst] - repo: https://github.com/sphinx-contrib/sphinx-lint - rev: v0.7.0 + rev: v0.8.1 hooks: - id: sphinx-lint - args: [--enable=default-role, -j1] + args: [--enable=default-role] files: ^Doc/|^Misc/NEWS.d/next/ - types: [rst] - repo: meta hooks: From ae495de4e65b2c8fbde16a246537b4484efed255 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 17 Oct 2023 11:59:10 +0200 Subject: [PATCH 088/265] [3.11] gh-110695: test_asyncio uses 50 ms for clock resolution (GH-110952) (#110971) gh-110695: test_asyncio uses 50 ms for clock resolution (GH-110952) Before utils.CLOCK_RES constant was added (20 ms), test_asyncio already used 50 ms. 
(cherry picked from commit 9a9fba825f8aaee4ea9b3429875c6c6324d0dee0) Co-authored-by: Victor Stinner --- Lib/test/test_asyncio/utils.py | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_asyncio/utils.py b/Lib/test/test_asyncio/utils.py index f49db4e37b3592..d6f60db10f9b3f 100644 --- a/Lib/test/test_asyncio/utils.py +++ b/Lib/test/test_asyncio/utils.py @@ -38,9 +38,9 @@ # Use the maximum known clock resolution (gh-75191, gh-110088): Windows -# GetTickCount64() has a resolution of 15.6 ms. Use 20 ms to tolerate rounding +# GetTickCount64() has a resolution of 15.6 ms. Use 50 ms to tolerate rounding # issues. -CLOCK_RES = 0.020 +CLOCK_RES = 0.050 def data_file(*filename): From f58bcee53a9ef542d44c2898d743362f3f96dc53 Mon Sep 17 00:00:00 2001 From: Nikita Sobolev Date: Tue, 17 Oct 2023 15:35:00 +0300 Subject: [PATCH 089/265] [3.11] Bump test deps: `ruff` and `pre-commit-hooks` (GH-110972) (#110981) (cherry picked from commit b75b1f389f083db8568bff573c33ab4ecf29655a) --- .pre-commit-config.yaml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index d69365f85c0507..fd2692e490bec3 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,6 +1,6 @@ repos: - repo: https://github.com/astral-sh/ruff-pre-commit - rev: v0.0.292 + rev: v0.1.0 hooks: - id: ruff name: Run Ruff on Lib/test/ @@ -8,7 +8,7 @@ repos: files: ^Lib/test/ - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v4.4.0 + rev: v4.5.0 hooks: - id: check-toml exclude: ^Lib/test/test_tomllib/ From 73ebe2f881ac2ffb4d3d23b8693213fea53357e9 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 17 Oct 2023 20:43:46 +0200 Subject: [PATCH 090/265] [3.11] gh-110995: Fix test_gdb check_usable_gdb() (GH-110998) (#111004) gh-110995: Fix test_gdb check_usable_gdb() (GH-110998) Fix detection of gdb built without Python scripting support. * check_usable_gdb() doesn't check gdb exit code when calling run_gdb(). * Use shutil.which() to get the path to the gdb program. (cherry picked from commit 920b3dfacad615c7bb9bd9a35774469f8809b453) Co-authored-by: Victor Stinner --- Lib/test/test_gdb/util.py | 17 +++++++++++------ ...23-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst | 2 ++ 2 files changed, 13 insertions(+), 6 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst diff --git a/Lib/test/test_gdb/util.py b/Lib/test/test_gdb/util.py index 7f4e3cba3534bd..8fe9cfc543395e 100644 --- a/Lib/test/test_gdb/util.py +++ b/Lib/test/test_gdb/util.py @@ -1,6 +1,7 @@ import os import re import shlex +import shutil import subprocess import sys import sysconfig @@ -8,6 +9,8 @@ from test import support +GDB_PROGRAM = shutil.which('gdb') or 'gdb' + # Location of custom hooks file in a repository checkout. CHECKOUT_HOOK_PATH = os.path.join(os.path.dirname(sys.executable), 'python-gdb.py') @@ -27,7 +30,7 @@ def clean_environment(): # Temporary value until it's initialized by get_gdb_version() below GDB_VERSION = (0, 0) -def run_gdb(*args, exitcode=0, **env_vars): +def run_gdb(*args, exitcode=0, check=True, **env_vars): """Runs gdb in --batch mode with the additional arguments given by *args. Returns its (stdout, stderr) decoded from utf-8 using the replace handler. 
@@ -36,7 +39,7 @@ def run_gdb(*args, exitcode=0, **env_vars): if env_vars: env.update(env_vars) - cmd = ['gdb', + cmd = [GDB_PROGRAM, # Batch mode: Exit after processing all the command files # specified with -x/--command '--batch', @@ -59,7 +62,7 @@ def run_gdb(*args, exitcode=0, **env_vars): stdout = proc.stdout stderr = proc.stderr - if proc.returncode != exitcode: + if check and proc.returncode != exitcode: cmd_text = shlex.join(cmd) raise Exception(f"{cmd_text} failed with exit code {proc.returncode}, " f"expected exit code {exitcode}:\n" @@ -72,10 +75,10 @@ def run_gdb(*args, exitcode=0, **env_vars): def get_gdb_version(): try: stdout, stderr = run_gdb('--version') - except OSError: + except OSError as exc: # This is what "no gdb" looks like. There may, however, be other # errors that manifest this way too. - raise unittest.SkipTest("Couldn't find gdb program on the path") + raise unittest.SkipTest(f"Couldn't find gdb program on the path: {exc}") # Regex to parse: # 'GNU gdb (GDB; SUSE Linux Enterprise 12) 7.7\n' -> 7.7 @@ -106,7 +109,8 @@ def check_usable_gdb(): # disallow this without a customized .gdbinit. stdout, stderr = run_gdb( '--eval-command=python import sys; print(sys.version_info)', - '--args', sys.executable) + '--args', sys.executable, + check=False) if "auto-loading has been declined" in stderr: raise unittest.SkipTest( @@ -144,6 +148,7 @@ def setup_module(): print(f"gdb version {GDB_VERSION[0]}.{GDB_VERSION[1]}:") for line in GDB_VERSION_TEXT.splitlines(): print(" " * 4 + line) + print(f" path: {GDB_PROGRAM}") print() diff --git a/Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst b/Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst new file mode 100644 index 00000000000000..db29eaf234b731 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst @@ -0,0 +1,2 @@ +test_gdb: Fix detection of gdb built without Python scripting support. Patch +by Victor Stinner. From 1af7b7db0d1cd6756ecb6081364fdd1378b1605c Mon Sep 17 00:00:00 2001 From: Lysandros Nikolaou Date: Wed, 18 Oct 2023 00:34:56 +0200 Subject: [PATCH 091/265] [3.11] gh-107450: Check for overflow in the tokenizer and fix overflow test (GH-110832) (#110939) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit (cherry picked from commit a1ac5590e0f8fe008e5562d22edab65d0c1c5507) Co-authored-by: Filipe Laíns Co-authored-by: Serhiy Storchaka --- Include/errcode.h | 39 +++++++++++++++++++------------------ Lib/test/test_exceptions.py | 16 +++++++++++---- Parser/pegen_errors.c | 10 ++++------ Parser/tokenizer.c | 4 ++++ 4 files changed, 40 insertions(+), 29 deletions(-) diff --git a/Include/errcode.h b/Include/errcode.h index 54ae929bf25870..19ee83ec73d8cd 100644 --- a/Include/errcode.h +++ b/Include/errcode.h @@ -4,7 +4,6 @@ extern "C" { #endif - /* Error codes passed around between file input, tokenizer, parser and interpreter. This is necessary so we can turn them into Python exceptions at a higher level. Note that some errors have a @@ -13,24 +12,26 @@ extern "C" { the parser only returns E_EOF when it hits EOF immediately, and it never returns E_OK. 
*/ -#define E_OK 10 /* No error */ -#define E_EOF 11 /* End Of File */ -#define E_INTR 12 /* Interrupted */ -#define E_TOKEN 13 /* Bad token */ -#define E_SYNTAX 14 /* Syntax error */ -#define E_NOMEM 15 /* Ran out of memory */ -#define E_DONE 16 /* Parsing complete */ -#define E_ERROR 17 /* Execution error */ -#define E_TABSPACE 18 /* Inconsistent mixing of tabs and spaces */ -#define E_OVERFLOW 19 /* Node had too many children */ -#define E_TOODEEP 20 /* Too many indentation levels */ -#define E_DEDENT 21 /* No matching outer block for dedent */ -#define E_DECODE 22 /* Error in decoding into Unicode */ -#define E_EOFS 23 /* EOF in triple-quoted string */ -#define E_EOLS 24 /* EOL in single-quoted string */ -#define E_LINECONT 25 /* Unexpected characters after a line continuation */ -#define E_BADSINGLE 27 /* Ill-formed single statement input */ -#define E_INTERACT_STOP 28 /* Interactive mode stopped tokenization */ +#define E_OK 10 /* No error */ +#define E_EOF 11 /* End Of File */ +#define E_INTR 12 /* Interrupted */ +#define E_TOKEN 13 /* Bad token */ +#define E_SYNTAX 14 /* Syntax error */ +#define E_NOMEM 15 /* Ran out of memory */ +#define E_DONE 16 /* Parsing complete */ +#define E_ERROR 17 /* Execution error */ +#define E_TABSPACE 18 /* Inconsistent mixing of tabs and spaces */ +#define E_OVERFLOW 19 /* Node had too many children */ +#define E_TOODEEP 20 /* Too many indentation levels */ +#define E_DEDENT 21 /* No matching outer block for dedent */ +#define E_DECODE 22 /* Error in decoding into Unicode */ +#define E_EOFS 23 /* EOF in triple-quoted string */ +#define E_EOLS 24 /* EOL in single-quoted string */ +#define E_LINECONT 25 /* Unexpected characters after a line continuation */ +#define E_BADSINGLE 27 /* Ill-formed single statement input */ +#define E_INTERACT_STOP 28 /* Interactive mode stopped tokenization */ +#define E_COLUMNOVERFLOW 29 /* Column offset overflow */ + #ifdef __cplusplus } diff --git a/Lib/test/test_exceptions.py b/Lib/test/test_exceptions.py index c7c63e5e4ae29d..cd064219c671e0 100644 --- a/Lib/test/test_exceptions.py +++ b/Lib/test/test_exceptions.py @@ -19,6 +19,12 @@ from test.support.warnings_helper import check_warnings from test import support +try: + from _testcapi import INT_MAX +except ImportError: + INT_MAX = 2**31 - 1 + + class NaiveException(Exception): def __init__(self, x): @@ -318,11 +324,13 @@ def baz(): check('(yield i) = 2', 1, 2) check('def f(*):\n pass', 1, 7) + @unittest.skipIf(INT_MAX >= sys.maxsize, "Downcasting to int is safe for col_offset") @support.requires_resource('cpu') - @support.bigmemtest(support._2G, memuse=1.5) - def testMemoryErrorBigSource(self, _size): - with self.assertRaises(OverflowError): - exec(f"if True:\n {' ' * 2**31}print('hello world')") + @support.bigmemtest(INT_MAX, memuse=2, dry_run=False) + def testMemoryErrorBigSource(self, size): + src = b"if True:\n%*s" % (size, b"pass") + with self.assertRaisesRegex(OverflowError, "Parser column offset overflow"): + compile(src, '', 'exec') @cpython_only def testSettingException(self): diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index ea5c4e227ff763..005356110a4340 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -101,6 +101,10 @@ _Pypegen_tokenizer_error(Parser *p) msg = "unexpected character after line continuation character"; break; } + case E_COLUMNOVERFLOW: + PyErr_SetString(PyExc_OverflowError, + "Parser column offset overflow - source line is too big"); + return -1; default: msg = "unknown parsing error"; } @@ -224,12 +228,6 @@ 
_PyPegen_raise_error(Parser *p, PyObject *errtype, const char *errmsg, ...) col_offset = 0; } else { const char* start = p->tok->buf ? p->tok->line_start : p->tok->buf; - if (p->tok->cur - start > INT_MAX) { - PyErr_SetString(PyExc_OverflowError, - "Parser column offset overflow - source line is too big"); - p->error_indicator = 1; - return NULL; - } col_offset = Py_SAFE_DOWNCAST(p->tok->cur - start, intptr_t, int); } } else { diff --git a/Parser/tokenizer.c b/Parser/tokenizer.c index 7fc8a585621d3a..566ad8dfa00e0c 100644 --- a/Parser/tokenizer.c +++ b/Parser/tokenizer.c @@ -1057,6 +1057,10 @@ tok_nextc(struct tok_state *tok) int rc; for (;;) { if (tok->cur != tok->inp) { + if (tok->cur - tok->buf >= INT_MAX) { + tok->done = E_COLUMNOVERFLOW; + return EOF; + } return Py_CHARMASK(*tok->cur++); /* Fast path */ } if (tok->done != E_OK) { From 4ccd418a024d15d986a01588f935f43248c2f05d Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Wed, 18 Oct 2023 01:20:36 +0200 Subject: [PATCH 092/265] [3.11] gh-110756: Fix libregrtest clear_caches() for distutils (#111011) gh-110756: Fix libregrtest clear_caches() for distutils Restore code removed by recent sync with the main branch which no longer has distutils: commit 26748ed4f61520c59af15547792d1e73144a4314. --- Lib/test/libregrtest/utils.py | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/Lib/test/libregrtest/utils.py b/Lib/test/libregrtest/utils.py index b5bbe0ef6596a7..f62184ad4fec6c 100644 --- a/Lib/test/libregrtest/utils.py +++ b/Lib/test/libregrtest/utils.py @@ -184,6 +184,15 @@ def clear_caches(): if stream is not None: stream.flush() + # Clear assorted module caches. + # Don't worry about resetting the cache if the module is not loaded + try: + distutils_dir_util = sys.modules['distutils.dir_util'] + except KeyError: + pass + else: + distutils_dir_util._path_created.clear() + try: re = sys.modules['re'] except KeyError: From 72131d0610baf5d486b83f79bdb63a75aa22b8f1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 18 Oct 2023 06:30:39 +0200 Subject: [PATCH 093/265] [3.11] Regen Doc/requirements-oldest-sphinx.txt (GH-111012) (#111021) Regen Doc/requirements-oldest-sphinx.txt (GH-111012) Fix https://github.com/python/cpython/security/dependabot/4: use urllib3 version 2.0.7. 
(cherry picked from commit e7ae43ad7dde74e731a9d258e372d17f3b2eb893) Co-authored-by: Victor Stinner --- Doc/requirements-oldest-sphinx.txt | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/Doc/requirements-oldest-sphinx.txt b/Doc/requirements-oldest-sphinx.txt index 5de739fc10b085..597341d99ffeaa 100644 --- a/Doc/requirements-oldest-sphinx.txt +++ b/Doc/requirements-oldest-sphinx.txt @@ -16,7 +16,6 @@ alabaster==0.7.13 Babel==2.13.0 certifi==2023.7.22 charset-normalizer==3.3.0 -colorama==0.4.6 docutils==0.17.1 idna==3.4 imagesize==1.4.1 @@ -33,4 +32,4 @@ sphinxcontrib-htmlhelp==2.0.1 sphinxcontrib-jsmath==1.0.1 sphinxcontrib-qthelp==1.0.3 sphinxcontrib-serializinghtml==1.1.5 -urllib3==2.0.6 +urllib3==2.0.7 From 50d936a125b17ff31b527347c0f4c37aa85efb83 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 18 Oct 2023 10:04:39 +0200 Subject: [PATCH 094/265] [3.11] gh-111019: Align expected and actual titles in test output (GH-111020) (#111025) gh-111019: Align expected and actual titles in test output (GH-111020) Align expected and actual titles in output from assert_has_calls/assert_called_with for greater readability (cherry picked from commit 77dbd956090aac66e264d9d640f6adb6b0930b87) Co-authored-by: James --- Lib/unittest/mock.py | 6 +++--- Lib/unittest/test/testmock/testmock.py | 14 +++++++------- 2 files changed, 10 insertions(+), 10 deletions(-) diff --git a/Lib/unittest/mock.py b/Lib/unittest/mock.py index e78eb350b8c0d5..3c96f1e864932a 100644 --- a/Lib/unittest/mock.py +++ b/Lib/unittest/mock.py @@ -822,7 +822,7 @@ def _format_mock_call_signature(self, args, kwargs): def _format_mock_failure_message(self, args, kwargs, action='call'): - message = 'expected %s not found.\nExpected: %s\nActual: %s' + message = 'expected %s not found.\nExpected: %s\n Actual: %s' expected_string = self._format_mock_call_signature(args, kwargs) call_args = self.call_args actual_string = self._format_mock_call_signature(*call_args) @@ -925,7 +925,7 @@ def assert_called_with(self, /, *args, **kwargs): if self.call_args is None: expected = self._format_mock_call_signature(args, kwargs) actual = 'not called.' - error_message = ('expected call not found.\nExpected: %s\nActual: %s' + error_message = ('expected call not found.\nExpected: %s\n Actual: %s' % (expected, actual)) raise AssertionError(error_message) @@ -976,7 +976,7 @@ def assert_has_calls(self, calls, any_order=False): raise AssertionError( f'{problem}\n' f'Expected: {_CallList(calls)}' - f'{self._calls_repr(prefix="Actual").rstrip(".")}' + f'{self._calls_repr(prefix=" Actual").rstrip(".")}' ) from cause return diff --git a/Lib/unittest/test/testmock/testmock.py b/Lib/unittest/test/testmock/testmock.py index eaae22e854e949..b2b299c14023b9 100644 --- a/Lib/unittest/test/testmock/testmock.py +++ b/Lib/unittest/test/testmock/testmock.py @@ -1039,7 +1039,7 @@ def test_assert_called_with_failure_message(self): actual = 'not called.' 
expected = "mock(1, '2', 3, bar='foo')" - message = 'expected call not found.\nExpected: %s\nActual: %s' + message = 'expected call not found.\nExpected: %s\n Actual: %s' self.assertRaisesWithMsg( AssertionError, message % (expected, actual), mock.assert_called_with, 1, '2', 3, bar='foo' @@ -1054,7 +1054,7 @@ def test_assert_called_with_failure_message(self): for meth in asserters: actual = "foo(1, '2', 3, foo='foo')" expected = "foo(1, '2', 3, bar='foo')" - message = 'expected call not found.\nExpected: %s\nActual: %s' + message = 'expected call not found.\nExpected: %s\n Actual: %s' self.assertRaisesWithMsg( AssertionError, message % (expected, actual), meth, 1, '2', 3, bar='foo' @@ -1064,7 +1064,7 @@ def test_assert_called_with_failure_message(self): for meth in asserters: actual = "foo(1, '2', 3, foo='foo')" expected = "foo(bar='foo')" - message = 'expected call not found.\nExpected: %s\nActual: %s' + message = 'expected call not found.\nExpected: %s\n Actual: %s' self.assertRaisesWithMsg( AssertionError, message % (expected, actual), meth, bar='foo' @@ -1074,7 +1074,7 @@ def test_assert_called_with_failure_message(self): for meth in asserters: actual = "foo(1, '2', 3, foo='foo')" expected = "foo(1, 2, 3)" - message = 'expected call not found.\nExpected: %s\nActual: %s' + message = 'expected call not found.\nExpected: %s\n Actual: %s' self.assertRaisesWithMsg( AssertionError, message % (expected, actual), meth, 1, 2, 3 @@ -1084,7 +1084,7 @@ def test_assert_called_with_failure_message(self): for meth in asserters: actual = "foo(1, '2', 3, foo='foo')" expected = "foo()" - message = 'expected call not found.\nExpected: %s\nActual: %s' + message = 'expected call not found.\nExpected: %s\n Actual: %s' self.assertRaisesWithMsg( AssertionError, message % (expected, actual), meth ) @@ -1533,7 +1533,7 @@ def f(x=None): pass '^{}$'.format( re.escape('Calls not found.\n' 'Expected: [call()]\n' - 'Actual: [call(1)]'))) as cm: + ' Actual: [call(1)]'))) as cm: mock.assert_has_calls([call()]) self.assertIsNone(cm.exception.__cause__) @@ -1545,7 +1545,7 @@ def f(x=None): pass 'Error processing expected calls.\n' "Errors: [None, TypeError('too many positional arguments')]\n" "Expected: [call(), call(1, 2)]\n" - 'Actual: [call(1)]'))) as cm: + ' Actual: [call(1)]'))) as cm: mock.assert_has_calls([call(), call(1, 2)]) self.assertIsInstance(cm.exception.__cause__, TypeError) From 7c308f4c64dca4093b70b7a5e49f81c4f3679359 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 18 Oct 2023 10:40:39 +0200 Subject: [PATCH 095/265] [3.11] gh-103737: IDLE - Remove unneeded .keys() for dict iteration (GH-110960) (#111027) gh-103737: IDLE - Remove unneeded .keys() for dict iteration (GH-110960) Add comments where .keys() is needed. Leave debugger usages along because situation is unclear as indicated in expanded comment. Most testing is manual. 
(cherry picked from commit baefbb21d91db2d950706737a6ebee9b2eff5c2d) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/config.py | 10 +++++++--- Lib/idlelib/configdialog.py | 22 +++++++++------------- Lib/idlelib/debugger.py | 2 +- Lib/idlelib/debugobj.py | 3 ++- Lib/idlelib/idle_test/test_config.py | 4 ++-- Lib/idlelib/idle_test/test_debugobj.py | 4 ++-- Lib/idlelib/pyshell.py | 7 ++++--- Lib/idlelib/stackviewer.py | 2 +- 8 files changed, 28 insertions(+), 26 deletions(-) diff --git a/Lib/idlelib/config.py b/Lib/idlelib/config.py index 2b09d79470b47c..898efeb4dd1550 100644 --- a/Lib/idlelib/config.py +++ b/Lib/idlelib/config.py @@ -597,7 +597,9 @@ def GetCoreKeys(self, keySetName=None): problem getting any core binding there will be an 'ultimate last resort fallback' to the CUA-ish bindings defined here. """ + # TODO: = dict(sorted([(v-event, keys), ...]))? keyBindings={ + # vitual-event: list of key events. '<>': ['', ''], '<>': ['', ''], '<>': ['', ''], @@ -880,7 +882,7 @@ def _dump(): # htest # (not really, but ignore in coverage) line, crc = 0, 0 def sprint(obj): - global line, crc + nonlocal line, crc txt = str(obj) line += 1 crc = crc32(txt.encode(encoding='utf-8'), crc) @@ -889,7 +891,7 @@ def sprint(obj): def dumpCfg(cfg): print('\n', cfg, '\n') # Cfg has variable '0xnnnnnnnn' address. - for key in sorted(cfg.keys()): + for key in sorted(cfg): sections = cfg[key].sections() sprint(key) sprint(sections) @@ -908,4 +910,6 @@ def dumpCfg(cfg): from unittest import main main('idlelib.idle_test.test_config', verbosity=2, exit=False) - # Run revised _dump() as htest? + _dump() + # Run revised _dump() (700+ lines) as htest? More sorting. + # Perhaps as window with tabs for textviews, making it config viewer. diff --git a/Lib/idlelib/configdialog.py b/Lib/idlelib/configdialog.py index cda7966d558a51..d56afe8a807fea 100644 --- a/Lib/idlelib/configdialog.py +++ b/Lib/idlelib/configdialog.py @@ -211,14 +211,8 @@ def help(self): contents=help_common+help_pages.get(page, '')) def deactivate_current_config(self): - """Remove current key bindings. - Iterate over window instances defined in parent and remove - the keybindings. - """ - # Before a config is saved, some cleanup of current - # config must be done - remove the previous keybindings. - win_instances = self.parent.instance_dict.keys() - for instance in win_instances: + """Remove current key bindings in current windows.""" + for instance in self.parent.instance_dict: instance.RemoveKeybindings() def activate_config_changes(self): @@ -227,8 +221,7 @@ def activate_config_changes(self): Dynamically update the current parent window instances with some of the configuration changes. """ - win_instances = self.parent.instance_dict.keys() - for instance in win_instances: + for instance in self.parent.instance_dict: instance.ResetColorizer() instance.ResetFont() instance.set_notabs_indentwidth() @@ -583,6 +576,8 @@ def create_page_highlight(self): (*)theme_message: Label """ self.theme_elements = { + # Display_name: ('internal_name, sort_number'). + # TODO: remove sort_number unneeded with dict ordering. 'Normal Code or Text': ('normal', '00'), 'Code Context': ('context', '01'), 'Python Keywords': ('keyword', '02'), @@ -765,7 +760,7 @@ def load_theme_cfg(self): self.builtinlist.SetMenu(item_list, item_list[0]) self.set_theme_type() # Load theme element option menu. 
- theme_names = list(self.theme_elements.keys()) + theme_names = list(self.theme_elements) theme_names.sort(key=lambda x: self.theme_elements[x][1]) self.targetlist.SetMenu(theme_names, theme_names[0]) self.paint_theme_sample() @@ -1477,12 +1472,13 @@ def load_keys_list(self, keyset_name): reselect = True list_index = self.bindingslist.index(ANCHOR) keyset = idleConf.GetKeySet(keyset_name) - bind_names = list(keyset.keys()) + # 'set' is dict mapping virtual event to list of key events. + bind_names = list(keyset) bind_names.sort() self.bindingslist.delete(0, END) for bind_name in bind_names: key = ' '.join(keyset[bind_name]) - bind_name = bind_name[2:-2] # Trim off the angle brackets. + bind_name = bind_name[2:-2] # Trim double angle brackets. if keyset_name in changes['keys']: # Handle any unsaved changes to this key set. if bind_name in changes['keys'][keyset_name]: diff --git a/Lib/idlelib/debugger.py b/Lib/idlelib/debugger.py index 452c62b42655b3..a92bb98d908d46 100644 --- a/Lib/idlelib/debugger.py +++ b/Lib/idlelib/debugger.py @@ -509,7 +509,7 @@ def load_dict(self, dict, force=0, rpc_client=None): # There is also an obscure bug in sorted(dict) where the # interpreter gets into a loop requesting non-existing dict[0], # dict[1], dict[2], etc from the debugger_r.DictProxy. - ### + # TODO recheck above; see debugger_r 159ff, debugobj 60. keys_list = dict.keys() names = sorted(keys_list) ### diff --git a/Lib/idlelib/debugobj.py b/Lib/idlelib/debugobj.py index 71d01c7070df54..032b686f379378 100644 --- a/Lib/idlelib/debugobj.py +++ b/Lib/idlelib/debugobj.py @@ -93,7 +93,8 @@ def setfunction(value, key=key, object=self.object): class DictTreeItem(SequenceTreeItem): def keys(self): - keys = list(self.object.keys()) + # TODO return sorted(self.object) + keys = list(self.object) try: keys.sort() except: diff --git a/Lib/idlelib/idle_test/test_config.py b/Lib/idlelib/idle_test/test_config.py index 08ed76fe288294..a746f1538a62b0 100644 --- a/Lib/idlelib/idle_test/test_config.py +++ b/Lib/idlelib/idle_test/test_config.py @@ -274,8 +274,8 @@ def test_create_config_handlers(self): conf.CreateConfigHandlers() # Check keys are equal - self.assertCountEqual(conf.defaultCfg.keys(), conf.config_types) - self.assertCountEqual(conf.userCfg.keys(), conf.config_types) + self.assertCountEqual(conf.defaultCfg, conf.config_types) + self.assertCountEqual(conf.userCfg, conf.config_types) # Check conf parser are correct type for default_parser in conf.defaultCfg.values(): diff --git a/Lib/idlelib/idle_test/test_debugobj.py b/Lib/idlelib/idle_test/test_debugobj.py index 131ce22b8bb69b..90ace4e1bc4f9e 100644 --- a/Lib/idlelib/idle_test/test_debugobj.py +++ b/Lib/idlelib/idle_test/test_debugobj.py @@ -37,7 +37,7 @@ def test_isexpandable(self): def test_keys(self): ti = debugobj.SequenceTreeItem('label', 'abc') - self.assertEqual(list(ti.keys()), [0, 1, 2]) + self.assertEqual(list(ti.keys()), [0, 1, 2]) # keys() is a range. class DictTreeItemTest(unittest.TestCase): @@ -50,7 +50,7 @@ def test_isexpandable(self): def test_keys(self): ti = debugobj.DictTreeItem('label', {1:1, 0:0, 2:2}) - self.assertEqual(ti.keys(), [0, 1, 2]) + self.assertEqual(ti.keys(), [0, 1, 2]) # keys() is a sorted list. 
if __name__ == '__main__': diff --git a/Lib/idlelib/pyshell.py b/Lib/idlelib/pyshell.py index 3141b477eff181..0a7ee59f730a56 100755 --- a/Lib/idlelib/pyshell.py +++ b/Lib/idlelib/pyshell.py @@ -747,10 +747,11 @@ def showtraceback(self): self.tkconsole.open_stack_viewer() def checklinecache(self): - c = linecache.cache - for key in list(c.keys()): + "Remove keys other than ''." + cache = linecache.cache + for key in list(cache): # Iterate list because mutate cache. if key[:1] + key[-1:] != "<>": - del c[key] + del cache[key] def runcommand(self, code): "Run the code without invoking the debugger" diff --git a/Lib/idlelib/stackviewer.py b/Lib/idlelib/stackviewer.py index 7b00c4cdb7d033..4858cc682a4f45 100644 --- a/Lib/idlelib/stackviewer.py +++ b/Lib/idlelib/stackviewer.py @@ -99,7 +99,7 @@ def IsExpandable(self): def GetSubList(self): sublist = [] - for key in self.object.keys(): + for key in self.object.keys(): # self.object not necessarily dict. try: value = self.object[key] except KeyError: From 4e4a3e161f52d17e0bab9d1757dc1610962b215b Mon Sep 17 00:00:00 2001 From: Pablo Galindo Salgado Date: Wed, 18 Oct 2023 13:59:17 +0100 Subject: [PATCH 096/265] [3.11] gh-110696: Fix incorrect syntax error message for incorrect argument unpacking (GH-110706) (#110766) --- Grammar/python.gram | 3 +- Lib/test/test_syntax.py | 11 + ...-10-11-13-46-14.gh-issue-110696.J9kSzr.rst | 2 + Parser/parser.c | 2334 ++++++++++------- 4 files changed, 1361 insertions(+), 989 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst diff --git a/Grammar/python.gram b/Grammar/python.gram index bae8bc32242f9a..7b462fd0cd4157 100644 --- a/Grammar/python.gram +++ b/Grammar/python.gram @@ -1075,7 +1075,8 @@ func_type_comment[Token*]: # From here on, there are rules for invalid syntax with specialised error messages invalid_arguments: - | a=args ',' '*' { RAISE_SYNTAX_ERROR_KNOWN_LOCATION(a, "iterable argument unpacking follows keyword argument unpacking") } + | ((','.(starred_expression | ( assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs) ',' b='*' { + RAISE_SYNTAX_ERROR_KNOWN_LOCATION(b, "iterable argument unpacking follows keyword argument unpacking") } | a=expression b=for_if_clauses ',' [args | expression for_if_clauses] { RAISE_SYNTAX_ERROR_KNOWN_RANGE(a, _PyPegen_get_last_comprehension_item(PyPegen_last_item(b, comprehension_ty)), "Generator expression must be parenthesized") } | a=NAME b='=' expression for_if_clauses { diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index 81456d62a37c49..07049087cdacc0 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -1827,6 +1827,17 @@ def f(x: *b) ^^^^^^^^^^^ SyntaxError: bytes can only contain ASCII literal characters + >>> f(**x, *y) + Traceback (most recent call last): + SyntaxError: iterable argument unpacking follows keyword argument unpacking + + >>> f(**x, *) + Traceback (most recent call last): + SyntaxError: iterable argument unpacking follows keyword argument unpacking + + >>> f(x, *:) + Traceback (most recent call last): + SyntaxError: invalid syntax """ import re diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst new file mode 100644 index 00000000000000..c845289d714f4c --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst @@ -0,0 +1,2 @@ +Fix incorrect error message for invalid 
argument unpacking. Patch by Pablo +Galindo diff --git a/Parser/parser.c b/Parser/parser.c index b2c0cfe3c2b9f7..af4c8933b181be 100644 --- a/Parser/parser.c +++ b/Parser/parser.c @@ -457,63 +457,63 @@ static char *soft_keywords[] = { #define _tmp_149_type 1377 #define _tmp_150_type 1378 #define _tmp_151_type 1379 -#define _loop0_152_type 1380 +#define _tmp_152_type 1380 #define _loop0_153_type 1381 #define _loop0_154_type 1382 -#define _tmp_155_type 1383 +#define _loop0_155_type 1383 #define _tmp_156_type 1384 #define _tmp_157_type 1385 #define _tmp_158_type 1386 -#define _loop0_159_type 1387 +#define _tmp_159_type 1387 #define _loop0_160_type 1388 -#define _loop1_161_type 1389 -#define _tmp_162_type 1390 -#define _loop0_163_type 1391 -#define _tmp_164_type 1392 -#define _loop0_165_type 1393 -#define _tmp_166_type 1394 -#define _loop0_167_type 1395 -#define _loop1_168_type 1396 -#define _tmp_169_type 1397 +#define _loop0_161_type 1389 +#define _loop1_162_type 1390 +#define _tmp_163_type 1391 +#define _loop0_164_type 1392 +#define _tmp_165_type 1393 +#define _loop0_166_type 1394 +#define _tmp_167_type 1395 +#define _loop0_168_type 1396 +#define _loop1_169_type 1397 #define _tmp_170_type 1398 #define _tmp_171_type 1399 -#define _loop0_172_type 1400 -#define _tmp_173_type 1401 +#define _tmp_172_type 1400 +#define _loop0_173_type 1401 #define _tmp_174_type 1402 -#define _loop1_175_type 1403 -#define _loop0_176_type 1404 +#define _tmp_175_type 1403 +#define _loop1_176_type 1404 #define _loop0_177_type 1405 -#define _loop0_179_type 1406 -#define _gather_178_type 1407 -#define _tmp_180_type 1408 -#define _loop0_181_type 1409 -#define _tmp_182_type 1410 -#define _loop0_183_type 1411 -#define _tmp_184_type 1412 -#define _loop0_185_type 1413 -#define _loop1_186_type 1414 +#define _loop0_178_type 1406 +#define _loop0_180_type 1407 +#define _gather_179_type 1408 +#define _tmp_181_type 1409 +#define _loop0_182_type 1410 +#define _tmp_183_type 1411 +#define _loop0_184_type 1412 +#define _tmp_185_type 1413 +#define _loop0_186_type 1414 #define _loop1_187_type 1415 -#define _tmp_188_type 1416 +#define _loop1_188_type 1416 #define _tmp_189_type 1417 -#define _loop0_190_type 1418 -#define _tmp_191_type 1419 +#define _tmp_190_type 1418 +#define _loop0_191_type 1419 #define _tmp_192_type 1420 #define _tmp_193_type 1421 -#define _loop0_195_type 1422 -#define _gather_194_type 1423 -#define _loop0_197_type 1424 -#define _gather_196_type 1425 -#define _loop0_199_type 1426 -#define _gather_198_type 1427 -#define _loop0_201_type 1428 -#define _gather_200_type 1429 -#define _tmp_202_type 1430 -#define _loop0_203_type 1431 -#define _loop1_204_type 1432 -#define _tmp_205_type 1433 -#define _loop0_206_type 1434 -#define _loop1_207_type 1435 -#define _tmp_208_type 1436 +#define _tmp_194_type 1422 +#define _loop0_196_type 1423 +#define _gather_195_type 1424 +#define _loop0_198_type 1425 +#define _gather_197_type 1426 +#define _loop0_200_type 1427 +#define _gather_199_type 1428 +#define _loop0_202_type 1429 +#define _gather_201_type 1430 +#define _tmp_203_type 1431 +#define _loop0_204_type 1432 +#define _loop1_205_type 1433 +#define _tmp_206_type 1434 +#define _loop0_207_type 1435 +#define _loop1_208_type 1436 #define _tmp_209_type 1437 #define _tmp_210_type 1438 #define _tmp_211_type 1439 @@ -523,9 +523,9 @@ static char *soft_keywords[] = { #define _tmp_215_type 1443 #define _tmp_216_type 1444 #define _tmp_217_type 1445 -#define _loop0_219_type 1446 -#define _gather_218_type 1447 -#define _tmp_220_type 1448 +#define 
_tmp_218_type 1446 +#define _loop0_220_type 1447 +#define _gather_219_type 1448 #define _tmp_221_type 1449 #define _tmp_222_type 1450 #define _tmp_223_type 1451 @@ -553,8 +553,14 @@ static char *soft_keywords[] = { #define _tmp_245_type 1473 #define _tmp_246_type 1474 #define _tmp_247_type 1475 -#define _tmp_248_type 1476 -#define _tmp_249_type 1477 +#define _loop0_249_type 1476 +#define _gather_248_type 1477 +#define _tmp_250_type 1478 +#define _tmp_251_type 1479 +#define _tmp_252_type 1480 +#define _tmp_253_type 1481 +#define _tmp_254_type 1482 +#define _tmp_255_type 1483 static mod_ty file_rule(Parser *p); static mod_ty interactive_rule(Parser *p); @@ -936,63 +942,63 @@ static void *_tmp_148_rule(Parser *p); static void *_tmp_149_rule(Parser *p); static void *_tmp_150_rule(Parser *p); static void *_tmp_151_rule(Parser *p); -static asdl_seq *_loop0_152_rule(Parser *p); +static void *_tmp_152_rule(Parser *p); static asdl_seq *_loop0_153_rule(Parser *p); static asdl_seq *_loop0_154_rule(Parser *p); -static void *_tmp_155_rule(Parser *p); +static asdl_seq *_loop0_155_rule(Parser *p); static void *_tmp_156_rule(Parser *p); static void *_tmp_157_rule(Parser *p); static void *_tmp_158_rule(Parser *p); -static asdl_seq *_loop0_159_rule(Parser *p); +static void *_tmp_159_rule(Parser *p); static asdl_seq *_loop0_160_rule(Parser *p); -static asdl_seq *_loop1_161_rule(Parser *p); -static void *_tmp_162_rule(Parser *p); -static asdl_seq *_loop0_163_rule(Parser *p); -static void *_tmp_164_rule(Parser *p); -static asdl_seq *_loop0_165_rule(Parser *p); -static void *_tmp_166_rule(Parser *p); -static asdl_seq *_loop0_167_rule(Parser *p); -static asdl_seq *_loop1_168_rule(Parser *p); -static void *_tmp_169_rule(Parser *p); +static asdl_seq *_loop0_161_rule(Parser *p); +static asdl_seq *_loop1_162_rule(Parser *p); +static void *_tmp_163_rule(Parser *p); +static asdl_seq *_loop0_164_rule(Parser *p); +static void *_tmp_165_rule(Parser *p); +static asdl_seq *_loop0_166_rule(Parser *p); +static void *_tmp_167_rule(Parser *p); +static asdl_seq *_loop0_168_rule(Parser *p); +static asdl_seq *_loop1_169_rule(Parser *p); static void *_tmp_170_rule(Parser *p); static void *_tmp_171_rule(Parser *p); -static asdl_seq *_loop0_172_rule(Parser *p); -static void *_tmp_173_rule(Parser *p); +static void *_tmp_172_rule(Parser *p); +static asdl_seq *_loop0_173_rule(Parser *p); static void *_tmp_174_rule(Parser *p); -static asdl_seq *_loop1_175_rule(Parser *p); -static asdl_seq *_loop0_176_rule(Parser *p); +static void *_tmp_175_rule(Parser *p); +static asdl_seq *_loop1_176_rule(Parser *p); static asdl_seq *_loop0_177_rule(Parser *p); -static asdl_seq *_loop0_179_rule(Parser *p); -static asdl_seq *_gather_178_rule(Parser *p); -static void *_tmp_180_rule(Parser *p); -static asdl_seq *_loop0_181_rule(Parser *p); -static void *_tmp_182_rule(Parser *p); -static asdl_seq *_loop0_183_rule(Parser *p); -static void *_tmp_184_rule(Parser *p); -static asdl_seq *_loop0_185_rule(Parser *p); -static asdl_seq *_loop1_186_rule(Parser *p); +static asdl_seq *_loop0_178_rule(Parser *p); +static asdl_seq *_loop0_180_rule(Parser *p); +static asdl_seq *_gather_179_rule(Parser *p); +static void *_tmp_181_rule(Parser *p); +static asdl_seq *_loop0_182_rule(Parser *p); +static void *_tmp_183_rule(Parser *p); +static asdl_seq *_loop0_184_rule(Parser *p); +static void *_tmp_185_rule(Parser *p); +static asdl_seq *_loop0_186_rule(Parser *p); static asdl_seq *_loop1_187_rule(Parser *p); -static void *_tmp_188_rule(Parser *p); +static asdl_seq 
*_loop1_188_rule(Parser *p); static void *_tmp_189_rule(Parser *p); -static asdl_seq *_loop0_190_rule(Parser *p); -static void *_tmp_191_rule(Parser *p); +static void *_tmp_190_rule(Parser *p); +static asdl_seq *_loop0_191_rule(Parser *p); static void *_tmp_192_rule(Parser *p); static void *_tmp_193_rule(Parser *p); -static asdl_seq *_loop0_195_rule(Parser *p); -static asdl_seq *_gather_194_rule(Parser *p); -static asdl_seq *_loop0_197_rule(Parser *p); -static asdl_seq *_gather_196_rule(Parser *p); -static asdl_seq *_loop0_199_rule(Parser *p); -static asdl_seq *_gather_198_rule(Parser *p); -static asdl_seq *_loop0_201_rule(Parser *p); -static asdl_seq *_gather_200_rule(Parser *p); -static void *_tmp_202_rule(Parser *p); -static asdl_seq *_loop0_203_rule(Parser *p); -static asdl_seq *_loop1_204_rule(Parser *p); -static void *_tmp_205_rule(Parser *p); -static asdl_seq *_loop0_206_rule(Parser *p); -static asdl_seq *_loop1_207_rule(Parser *p); -static void *_tmp_208_rule(Parser *p); +static void *_tmp_194_rule(Parser *p); +static asdl_seq *_loop0_196_rule(Parser *p); +static asdl_seq *_gather_195_rule(Parser *p); +static asdl_seq *_loop0_198_rule(Parser *p); +static asdl_seq *_gather_197_rule(Parser *p); +static asdl_seq *_loop0_200_rule(Parser *p); +static asdl_seq *_gather_199_rule(Parser *p); +static asdl_seq *_loop0_202_rule(Parser *p); +static asdl_seq *_gather_201_rule(Parser *p); +static void *_tmp_203_rule(Parser *p); +static asdl_seq *_loop0_204_rule(Parser *p); +static asdl_seq *_loop1_205_rule(Parser *p); +static void *_tmp_206_rule(Parser *p); +static asdl_seq *_loop0_207_rule(Parser *p); +static asdl_seq *_loop1_208_rule(Parser *p); static void *_tmp_209_rule(Parser *p); static void *_tmp_210_rule(Parser *p); static void *_tmp_211_rule(Parser *p); @@ -1002,9 +1008,9 @@ static void *_tmp_214_rule(Parser *p); static void *_tmp_215_rule(Parser *p); static void *_tmp_216_rule(Parser *p); static void *_tmp_217_rule(Parser *p); -static asdl_seq *_loop0_219_rule(Parser *p); -static asdl_seq *_gather_218_rule(Parser *p); -static void *_tmp_220_rule(Parser *p); +static void *_tmp_218_rule(Parser *p); +static asdl_seq *_loop0_220_rule(Parser *p); +static asdl_seq *_gather_219_rule(Parser *p); static void *_tmp_221_rule(Parser *p); static void *_tmp_222_rule(Parser *p); static void *_tmp_223_rule(Parser *p); @@ -1032,8 +1038,14 @@ static void *_tmp_244_rule(Parser *p); static void *_tmp_245_rule(Parser *p); static void *_tmp_246_rule(Parser *p); static void *_tmp_247_rule(Parser *p); -static void *_tmp_248_rule(Parser *p); -static void *_tmp_249_rule(Parser *p); +static asdl_seq *_loop0_249_rule(Parser *p); +static asdl_seq *_gather_248_rule(Parser *p); +static void *_tmp_250_rule(Parser *p); +static void *_tmp_251_rule(Parser *p); +static void *_tmp_252_rule(Parser *p); +static void *_tmp_253_rule(Parser *p); +static void *_tmp_254_rule(Parser *p); +static void *_tmp_255_rule(Parser *p); // file: statements? 
$ @@ -18915,7 +18927,7 @@ func_type_comment_rule(Parser *p) } // invalid_arguments: -// | args ',' '*' +// | ((','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs) ',' '*' // | expression for_if_clauses ',' [args | expression for_if_clauses] // | NAME '=' expression for_if_clauses // | args for_if_clauses @@ -18934,25 +18946,25 @@ invalid_arguments_rule(Parser *p) } void * _res = NULL; int _mark = p->mark; - { // args ',' '*' + { // ((','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs) ',' '*' if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args ',' '*'")); + D(fprintf(stderr, "%*c> invalid_arguments[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs) ',' '*'")); Token * _literal; - Token * _literal_1; - expr_ty a; + void *_tmp_144_var; + Token * b; if ( - (a = args_rule(p)) // args + (_tmp_144_var = _tmp_144_rule(p)) // (','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs && (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_literal_1 = _PyPegen_expect_token(p, 16)) // token='*' + (b = _PyPegen_expect_token(p, 16)) // token='*' ) { - D(fprintf(stderr, "%*c+ invalid_arguments[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "args ',' '*'")); - _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( a , "iterable argument unpacking follows keyword argument unpacking" ); + D(fprintf(stderr, "%*c+ invalid_arguments[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "((','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs) ',' '*'")); + _res = RAISE_SYNTAX_ERROR_KNOWN_LOCATION ( b , "iterable argument unpacking follows keyword argument unpacking" ); if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; p->level--; @@ -18962,7 +18974,7 @@ invalid_arguments_rule(Parser *p) } p->mark = _mark; D(fprintf(stderr, "%*c%s invalid_arguments[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "args ',' '*'")); + p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "((','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) | kwargs) ',' '*'")); } { // expression for_if_clauses ',' [args | expression for_if_clauses] if (p->error_indicator) { @@ -18982,7 +18994,7 @@ invalid_arguments_rule(Parser *p) && (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_opt_var = _tmp_144_rule(p), !p->error_indicator) // [args | expression for_if_clauses] + (_opt_var = _tmp_145_rule(p), !p->error_indicator) // [args | expression for_if_clauses] ) { D(fprintf(stderr, "%*c+ invalid_arguments[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses ',' [args | expression for_if_clauses]")); @@ -19153,7 +19165,7 @@ invalid_kwarg_rule(Parser *p) Token* a; Token * b; if ( - (a = (Token*)_tmp_145_rule(p)) // 'True' | 'False' | 'None' + (a = (Token*)_tmp_146_rule(p)) // 'True' | 'False' | 'None' && (b = _PyPegen_expect_token(p, 22)) // token='=' ) @@ -19213,7 +19225,7 @@ invalid_kwarg_rule(Parser *p) expr_ty a; Token * b; if ( - _PyPegen_lookahead(0, _tmp_146_rule, p) + _PyPegen_lookahead(0, _tmp_147_rule, p) && (a = expression_rule(p)) // expression && @@ -19438,7 +19450,7 @@ invalid_expression_rule(Parser *p) expr_ty a; expr_ty b; if ( - _PyPegen_lookahead(0, _tmp_147_rule, p) + _PyPegen_lookahead(0, _tmp_148_rule, p) && (a = disjunction_rule(p)) // disjunction && @@ -19474,7 +19486,7 @@ invalid_expression_rule(Parser *p) && (b = disjunction_rule(p)) // disjunction && - _PyPegen_lookahead(0, _tmp_148_rule, p) + _PyPegen_lookahead(0, _tmp_149_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "disjunction 'if' disjunction !('else' | ':')")); @@ -19563,7 +19575,7 @@ invalid_named_expression_rule(Parser *p) && (b = bitwise_or_rule(p)) // bitwise_or && - _PyPegen_lookahead(0, _tmp_149_rule, p) + _PyPegen_lookahead(0, _tmp_150_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_named_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '=' bitwise_or !('=' | ':=')")); @@ -19589,7 +19601,7 @@ invalid_named_expression_rule(Parser *p) Token * b; expr_ty bitwise_or_var; if ( - _PyPegen_lookahead(0, _tmp_150_rule, p) + _PyPegen_lookahead(0, _tmp_151_rule, p) && (a = bitwise_or_rule(p)) // bitwise_or && @@ -19597,7 +19609,7 @@ invalid_named_expression_rule(Parser *p) && (bitwise_or_var = bitwise_or_rule(p)) // bitwise_or && - _PyPegen_lookahead(0, _tmp_151_rule, p) + _PyPegen_lookahead(0, _tmp_152_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_named_expression[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "!(list | tuple | genexp | 'True' | 'None' | 'False') bitwise_or '=' bitwise_or !('=' | ':=')")); @@ -19678,7 +19690,7 @@ invalid_assignment_rule(Parser *p) D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expression ',' star_named_expressions* ':' expression")); Token * _literal; Token * _literal_1; - asdl_seq * _loop0_152_var; + asdl_seq * _loop0_153_var; expr_ty a; expr_ty expression_var; if ( @@ -19686,7 +19698,7 @@ invalid_assignment_rule(Parser *p) && (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_loop0_152_var = _loop0_152_rule(p)) // star_named_expressions* + (_loop0_153_var = _loop0_153_rule(p)) // star_named_expressions* && (_literal_1 = _PyPegen_expect_token(p, 11)) // token=':' && @@ -19743,10 +19755,10 @@ invalid_assignment_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, 
"((star_targets '='))* star_expressions '='")); Token * _literal; - asdl_seq * _loop0_153_var; + asdl_seq * _loop0_154_var; expr_ty a; if ( - (_loop0_153_var = _loop0_153_rule(p)) // ((star_targets '='))* + (_loop0_154_var = _loop0_154_rule(p)) // ((star_targets '='))* && (a = star_expressions_rule(p)) // star_expressions && @@ -19773,10 +19785,10 @@ invalid_assignment_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "((star_targets '='))* yield_expr '='")); Token * _literal; - asdl_seq * _loop0_154_var; + asdl_seq * _loop0_155_var; expr_ty a; if ( - (_loop0_154_var = _loop0_154_rule(p)) // ((star_targets '='))* + (_loop0_155_var = _loop0_155_rule(p)) // ((star_targets '='))* && (a = yield_expr_rule(p)) // yield_expr && @@ -19802,7 +19814,7 @@ invalid_assignment_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_assignment[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions augassign (yield_expr | star_expressions)")); - void *_tmp_155_var; + void *_tmp_156_var; expr_ty a; AugOperator* augassign_var; if ( @@ -19810,7 +19822,7 @@ invalid_assignment_rule(Parser *p) && (augassign_var = augassign_rule(p)) // augassign && - (_tmp_155_var = _tmp_155_rule(p)) // yield_expr | star_expressions + (_tmp_156_var = _tmp_156_rule(p)) // yield_expr | star_expressions ) { D(fprintf(stderr, "%*c+ invalid_assignment[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expressions augassign (yield_expr | star_expressions)")); @@ -20036,11 +20048,11 @@ invalid_comprehension_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '(' | '{') starred_expression for_if_clauses")); - void *_tmp_156_var; + void *_tmp_157_var; expr_ty a; asdl_comprehension_seq* for_if_clauses_var; if ( - (_tmp_156_var = _tmp_156_rule(p)) // '[' | '(' | '{' + (_tmp_157_var = _tmp_157_rule(p)) // '[' | '(' | '{' && (a = starred_expression_rule(p)) // starred_expression && @@ -20067,12 +20079,12 @@ invalid_comprehension_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '{') star_named_expression ',' star_named_expressions for_if_clauses")); Token * _literal; - void *_tmp_157_var; + void *_tmp_158_var; expr_ty a; asdl_expr_seq* b; asdl_comprehension_seq* for_if_clauses_var; if ( - (_tmp_157_var = _tmp_157_rule(p)) // '[' | '{' + (_tmp_158_var = _tmp_158_rule(p)) // '[' | '{' && (a = star_named_expression_rule(p)) // star_named_expression && @@ -20102,12 +20114,12 @@ invalid_comprehension_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_comprehension[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('[' | '{') star_named_expression ',' for_if_clauses")); - void *_tmp_158_var; + void *_tmp_159_var; expr_ty a; Token * b; asdl_comprehension_seq* for_if_clauses_var; if ( - (_tmp_158_var = _tmp_158_rule(p)) // '[' | '{' + (_tmp_159_var = _tmp_159_rule(p)) // '[' | '{' && (a = star_named_expression_rule(p)) // star_named_expression && @@ -20217,11 +20229,11 @@ invalid_parameters_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* invalid_parameters_helper param_no_default")); - asdl_seq * _loop0_159_var; + asdl_seq * _loop0_160_var; arg_ty a; void *invalid_parameters_helper_var; if ( - (_loop0_159_var = _loop0_159_rule(p)) // param_no_default* + (_loop0_160_var = _loop0_160_rule(p)) // param_no_default* && 
(invalid_parameters_helper_var = invalid_parameters_helper_rule(p)) // invalid_parameters_helper && @@ -20247,18 +20259,18 @@ invalid_parameters_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default* '(' param_no_default+ ','? ')'")); - asdl_seq * _loop0_160_var; - asdl_seq * _loop1_161_var; + asdl_seq * _loop0_161_var; + asdl_seq * _loop1_162_var; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings Token * a; Token * b; if ( - (_loop0_160_var = _loop0_160_rule(p)) // param_no_default* + (_loop0_161_var = _loop0_161_rule(p)) // param_no_default* && (a = _PyPegen_expect_token(p, 7)) // token='(' && - (_loop1_161_var = _loop1_161_rule(p)) // param_no_default+ + (_loop1_162_var = _loop1_162_rule(p)) // param_no_default+ && (_opt_var = _PyPegen_expect_token(p, 12), !p->error_indicator) // ','? && @@ -20311,13 +20323,13 @@ invalid_parameters_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(slash_no_default | slash_with_default) param_maybe_default* '/'")); - asdl_seq * _loop0_163_var; - void *_tmp_162_var; + asdl_seq * _loop0_164_var; + void *_tmp_163_var; Token * a; if ( - (_tmp_162_var = _tmp_162_rule(p)) // slash_no_default | slash_with_default + (_tmp_163_var = _tmp_163_rule(p)) // slash_no_default | slash_with_default && - (_loop0_163_var = _loop0_163_rule(p)) // param_maybe_default* + (_loop0_164_var = _loop0_164_rule(p)) // param_maybe_default* && (a = _PyPegen_expect_token(p, 17)) // token='/' ) @@ -20342,22 +20354,22 @@ invalid_parameters_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "[(slash_no_default | slash_with_default)] param_maybe_default* '*' (',' | param_no_default) param_maybe_default* '/'")); Token * _literal; - asdl_seq * _loop0_165_var; - asdl_seq * _loop0_167_var; + asdl_seq * _loop0_166_var; + asdl_seq * _loop0_168_var; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings - void *_tmp_166_var; + void *_tmp_167_var; Token * a; if ( - (_opt_var = _tmp_164_rule(p), !p->error_indicator) // [(slash_no_default | slash_with_default)] + (_opt_var = _tmp_165_rule(p), !p->error_indicator) // [(slash_no_default | slash_with_default)] && - (_loop0_165_var = _loop0_165_rule(p)) // param_maybe_default* + (_loop0_166_var = _loop0_166_rule(p)) // param_maybe_default* && (_literal = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_166_var = _tmp_166_rule(p)) // ',' | param_no_default + (_tmp_167_var = _tmp_167_rule(p)) // ',' | param_no_default && - (_loop0_167_var = _loop0_167_rule(p)) // param_maybe_default* + (_loop0_168_var = _loop0_168_rule(p)) // param_maybe_default* && (a = _PyPegen_expect_token(p, 17)) // token='/' ) @@ -20382,10 +20394,10 @@ invalid_parameters_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default+ '/' '*'")); Token * _literal; - asdl_seq * _loop1_168_var; + asdl_seq * _loop1_169_var; Token * a; if ( - (_loop1_168_var = _loop1_168_rule(p)) // param_maybe_default+ + (_loop1_169_var = _loop1_169_rule(p)) // param_maybe_default+ && (_literal = _PyPegen_expect_token(p, 17)) // token='/' && @@ -20435,7 +20447,7 @@ invalid_default_rule(Parser *p) if ( (a = _PyPegen_expect_token(p, 22)) // token='=' && - _PyPegen_lookahead(1, _tmp_169_rule, p) + _PyPegen_lookahead(1, _tmp_170_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_default[%d-%d]: %s succeeded!\n", 
p->level, ' ', _mark, p->mark, "'=' &(')' | ',')")); @@ -20481,12 +20493,12 @@ invalid_star_etc_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (')' | ',' (')' | '**'))")); - void *_tmp_170_var; + void *_tmp_171_var; Token * a; if ( (a = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_170_var = _tmp_170_rule(p)) // ')' | ',' (')' | '**') + (_tmp_171_var = _tmp_171_rule(p)) // ')' | ',' (')' | '**') ) { D(fprintf(stderr, "%*c+ invalid_star_etc[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' (')' | ',' (')' | '**'))")); @@ -20569,20 +20581,20 @@ invalid_star_etc_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (param_no_default | ',') param_maybe_default* '*' (param_no_default | ',')")); Token * _literal; - asdl_seq * _loop0_172_var; - void *_tmp_171_var; - void *_tmp_173_var; + asdl_seq * _loop0_173_var; + void *_tmp_172_var; + void *_tmp_174_var; Token * a; if ( (_literal = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_171_var = _tmp_171_rule(p)) // param_no_default | ',' + (_tmp_172_var = _tmp_172_rule(p)) // param_no_default | ',' && - (_loop0_172_var = _loop0_172_rule(p)) // param_maybe_default* + (_loop0_173_var = _loop0_173_rule(p)) // param_maybe_default* && (a = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_173_var = _tmp_173_rule(p)) // param_no_default | ',' + (_tmp_174_var = _tmp_174_rule(p)) // param_no_default | ',' ) { D(fprintf(stderr, "%*c+ invalid_star_etc[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' (param_no_default | ',') param_maybe_default* '*' (param_no_default | ',')")); @@ -20698,7 +20710,7 @@ invalid_kwds_rule(Parser *p) && (_literal_1 = _PyPegen_expect_token(p, 12)) // token=',' && - (a = (Token*)_tmp_174_rule(p)) // '*' | '**' | '/' + (a = (Token*)_tmp_175_rule(p)) // '*' | '**' | '/' ) { D(fprintf(stderr, "%*c+ invalid_kwds[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**' param ',' ('*' | '**' | '/')")); @@ -20764,13 +20776,13 @@ invalid_parameters_helper_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default+")); - asdl_seq * _loop1_175_var; + asdl_seq * _loop1_176_var; if ( - (_loop1_175_var = _loop1_175_rule(p)) // param_with_default+ + (_loop1_176_var = _loop1_176_rule(p)) // param_with_default+ ) { D(fprintf(stderr, "%*c+ invalid_parameters_helper[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_with_default+")); - _res = _loop1_175_var; + _res = _loop1_176_var; goto done; } p->mark = _mark; @@ -20809,11 +20821,11 @@ invalid_lambda_parameters_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* invalid_lambda_parameters_helper lambda_param_no_default")); - asdl_seq * _loop0_176_var; + asdl_seq * _loop0_177_var; arg_ty a; void *invalid_lambda_parameters_helper_var; if ( - (_loop0_176_var = _loop0_176_rule(p)) // lambda_param_no_default* + (_loop0_177_var = _loop0_177_rule(p)) // lambda_param_no_default* && (invalid_lambda_parameters_helper_var = invalid_lambda_parameters_helper_rule(p)) // invalid_lambda_parameters_helper && @@ -20839,18 +20851,18 @@ invalid_lambda_parameters_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default* '(' ','.lambda_param+ ','? 
')'")); - asdl_seq * _gather_178_var; - asdl_seq * _loop0_177_var; + asdl_seq * _gather_179_var; + asdl_seq * _loop0_178_var; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings Token * a; Token * b; if ( - (_loop0_177_var = _loop0_177_rule(p)) // lambda_param_no_default* + (_loop0_178_var = _loop0_178_rule(p)) // lambda_param_no_default* && (a = _PyPegen_expect_token(p, 7)) // token='(' && - (_gather_178_var = _gather_178_rule(p)) // ','.lambda_param+ + (_gather_179_var = _gather_179_rule(p)) // ','.lambda_param+ && (_opt_var = _PyPegen_expect_token(p, 12), !p->error_indicator) // ','? && @@ -20903,13 +20915,13 @@ invalid_lambda_parameters_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(lambda_slash_no_default | lambda_slash_with_default) lambda_param_maybe_default* '/'")); - asdl_seq * _loop0_181_var; - void *_tmp_180_var; + asdl_seq * _loop0_182_var; + void *_tmp_181_var; Token * a; if ( - (_tmp_180_var = _tmp_180_rule(p)) // lambda_slash_no_default | lambda_slash_with_default + (_tmp_181_var = _tmp_181_rule(p)) // lambda_slash_no_default | lambda_slash_with_default && - (_loop0_181_var = _loop0_181_rule(p)) // lambda_param_maybe_default* + (_loop0_182_var = _loop0_182_rule(p)) // lambda_param_maybe_default* && (a = _PyPegen_expect_token(p, 17)) // token='/' ) @@ -20934,22 +20946,22 @@ invalid_lambda_parameters_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "[(lambda_slash_no_default | lambda_slash_with_default)] lambda_param_maybe_default* '*' (',' | lambda_param_no_default) lambda_param_maybe_default* '/'")); Token * _literal; - asdl_seq * _loop0_183_var; - asdl_seq * _loop0_185_var; + asdl_seq * _loop0_184_var; + asdl_seq * _loop0_186_var; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings - void *_tmp_184_var; + void *_tmp_185_var; Token * a; if ( - (_opt_var = _tmp_182_rule(p), !p->error_indicator) // [(lambda_slash_no_default | lambda_slash_with_default)] + (_opt_var = _tmp_183_rule(p), !p->error_indicator) // [(lambda_slash_no_default | lambda_slash_with_default)] && - (_loop0_183_var = _loop0_183_rule(p)) // lambda_param_maybe_default* + (_loop0_184_var = _loop0_184_rule(p)) // lambda_param_maybe_default* && (_literal = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_184_var = _tmp_184_rule(p)) // ',' | lambda_param_no_default + (_tmp_185_var = _tmp_185_rule(p)) // ',' | lambda_param_no_default && - (_loop0_185_var = _loop0_185_rule(p)) // lambda_param_maybe_default* + (_loop0_186_var = _loop0_186_rule(p)) // lambda_param_maybe_default* && (a = _PyPegen_expect_token(p, 17)) // token='/' ) @@ -20974,10 +20986,10 @@ invalid_lambda_parameters_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_lambda_parameters[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default+ '/' '*'")); Token * _literal; - asdl_seq * _loop1_186_var; + asdl_seq * _loop1_187_var; Token * a; if ( - (_loop1_186_var = _loop1_186_rule(p)) // lambda_param_maybe_default+ + (_loop1_187_var = _loop1_187_rule(p)) // lambda_param_maybe_default+ && (_literal = _PyPegen_expect_token(p, 17)) // token='/' && @@ -21049,13 +21061,13 @@ invalid_lambda_parameters_helper_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_lambda_parameters_helper[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default+")); - asdl_seq * _loop1_187_var; + asdl_seq * _loop1_188_var; if ( - (_loop1_187_var = _loop1_187_rule(p)) // 
lambda_param_with_default+ + (_loop1_188_var = _loop1_188_rule(p)) // lambda_param_with_default+ ) { D(fprintf(stderr, "%*c+ invalid_lambda_parameters_helper[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default+")); - _res = _loop1_187_var; + _res = _loop1_188_var; goto done; } p->mark = _mark; @@ -21092,11 +21104,11 @@ invalid_lambda_star_etc_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (':' | ',' (':' | '**'))")); Token * _literal; - void *_tmp_188_var; + void *_tmp_189_var; if ( (_literal = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_188_var = _tmp_188_rule(p)) // ':' | ',' (':' | '**') + (_tmp_189_var = _tmp_189_rule(p)) // ':' | ',' (':' | '**') ) { D(fprintf(stderr, "%*c+ invalid_lambda_star_etc[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' (':' | ',' (':' | '**'))")); @@ -21149,20 +21161,20 @@ invalid_lambda_star_etc_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_lambda_star_etc[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*' (lambda_param_no_default | ',') lambda_param_maybe_default* '*' (lambda_param_no_default | ',')")); Token * _literal; - asdl_seq * _loop0_190_var; - void *_tmp_189_var; - void *_tmp_191_var; + asdl_seq * _loop0_191_var; + void *_tmp_190_var; + void *_tmp_192_var; Token * a; if ( (_literal = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_189_var = _tmp_189_rule(p)) // lambda_param_no_default | ',' + (_tmp_190_var = _tmp_190_rule(p)) // lambda_param_no_default | ',' && - (_loop0_190_var = _loop0_190_rule(p)) // lambda_param_maybe_default* + (_loop0_191_var = _loop0_191_rule(p)) // lambda_param_maybe_default* && (a = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_191_var = _tmp_191_rule(p)) // lambda_param_no_default | ',' + (_tmp_192_var = _tmp_192_rule(p)) // lambda_param_no_default | ',' ) { D(fprintf(stderr, "%*c+ invalid_lambda_star_etc[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*' (lambda_param_no_default | ',') lambda_param_maybe_default* '*' (lambda_param_no_default | ',')")); @@ -21281,7 +21293,7 @@ invalid_lambda_kwds_rule(Parser *p) && (_literal_1 = _PyPegen_expect_token(p, 12)) // token=',' && - (a = (Token*)_tmp_192_rule(p)) // '*' | '**' | '/' + (a = (Token*)_tmp_193_rule(p)) // '*' | '**' | '/' ) { D(fprintf(stderr, "%*c+ invalid_lambda_kwds[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**' lambda_param ',' ('*' | '**' | '/')")); @@ -21389,7 +21401,7 @@ invalid_with_item_rule(Parser *p) && (a = expression_rule(p)) // expression && - _PyPegen_lookahead(1, _tmp_193_rule, p) + _PyPegen_lookahead(1, _tmp_194_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_with_item[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression 'as' expression &(',' | ')' | ':')")); @@ -21617,7 +21629,7 @@ invalid_with_stmt_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 
'with' ','.(expression ['as' star_target])+ NEWLINE")); - asdl_seq * _gather_194_var; + asdl_seq * _gather_195_var; Token * _keyword; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings @@ -21627,7 +21639,7 @@ invalid_with_stmt_rule(Parser *p) && (_keyword = _PyPegen_expect_token(p, 612)) // token='with' && - (_gather_194_var = _gather_194_rule(p)) // ','.(expression ['as' star_target])+ + (_gather_195_var = _gather_195_rule(p)) // ','.(expression ['as' star_target])+ && (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) @@ -21651,7 +21663,7 @@ invalid_with_stmt_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' NEWLINE")); - asdl_seq * _gather_196_var; + asdl_seq * _gather_197_var; Token * _keyword; Token * _literal; Token * _literal_1; @@ -21667,7 +21679,7 @@ invalid_with_stmt_rule(Parser *p) && (_literal = _PyPegen_expect_token(p, 7)) // token='(' && - (_gather_196_var = _gather_196_rule(p)) // ','.(expressions ['as' star_target])+ + (_gather_197_var = _gather_197_rule(p)) // ','.(expressions ['as' star_target])+ && (_opt_var_1 = _PyPegen_expect_token(p, 12), !p->error_indicator) // ','? && @@ -21717,7 +21729,7 @@ invalid_with_stmt_indent_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' ','.(expression ['as' star_target])+ ':' NEWLINE !INDENT")); - asdl_seq * _gather_198_var; + asdl_seq * _gather_199_var; Token * _literal; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings @@ -21728,7 +21740,7 @@ invalid_with_stmt_indent_rule(Parser *p) && (a = _PyPegen_expect_token(p, 612)) // token='with' && - (_gather_198_var = _gather_198_rule(p)) // ','.(expression ['as' star_target])+ + (_gather_199_var = _gather_199_rule(p)) // ','.(expression ['as' star_target])+ && (_literal = _PyPegen_expect_token(p, 11)) // token=':' && @@ -21756,7 +21768,7 @@ invalid_with_stmt_indent_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_with_stmt_indent[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "ASYNC? 'with' '(' ','.(expressions ['as' star_target])+ ','? ')' ':' NEWLINE !INDENT")); - asdl_seq * _gather_200_var; + asdl_seq * _gather_201_var; Token * _literal; Token * _literal_1; Token * _literal_2; @@ -21773,7 +21785,7 @@ invalid_with_stmt_indent_rule(Parser *p) && (_literal = _PyPegen_expect_token(p, 7)) // token='(' && - (_gather_200_var = _gather_200_rule(p)) // ','.(expressions ['as' star_target])+ + (_gather_201_var = _gather_201_rule(p)) // ','.(expressions ['as' star_target])+ && (_opt_var_1 = _PyPegen_expect_token(p, 12), !p->error_indicator) // ','? 
&& @@ -21871,7 +21883,7 @@ invalid_try_stmt_rule(Parser *p) && (block_var = block_rule(p)) // block && - _PyPegen_lookahead(0, _tmp_202_rule, p) + _PyPegen_lookahead(0, _tmp_203_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_try_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'try' ':' block !('except' | 'finally')")); @@ -21896,8 +21908,8 @@ invalid_try_stmt_rule(Parser *p) Token * _keyword; Token * _literal; Token * _literal_1; - asdl_seq * _loop0_203_var; - asdl_seq * _loop1_204_var; + asdl_seq * _loop0_204_var; + asdl_seq * _loop1_205_var; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings Token * a; @@ -21908,9 +21920,9 @@ invalid_try_stmt_rule(Parser *p) && (_literal = _PyPegen_expect_token(p, 11)) // token=':' && - (_loop0_203_var = _loop0_203_rule(p)) // block* + (_loop0_204_var = _loop0_204_rule(p)) // block* && - (_loop1_204_var = _loop1_204_rule(p)) // except_block+ + (_loop1_205_var = _loop1_205_rule(p)) // except_block+ && (a = _PyPegen_expect_token(p, 634)) // token='except' && @@ -21918,7 +21930,7 @@ invalid_try_stmt_rule(Parser *p) && (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_205_rule(p), !p->error_indicator) // ['as' NAME] + (_opt_var = _tmp_206_rule(p), !p->error_indicator) // ['as' NAME] && (_literal_1 = _PyPegen_expect_token(p, 11)) // token=':' ) @@ -21945,8 +21957,8 @@ invalid_try_stmt_rule(Parser *p) Token * _keyword; Token * _literal; Token * _literal_1; - asdl_seq * _loop0_206_var; - asdl_seq * _loop1_207_var; + asdl_seq * _loop0_207_var; + asdl_seq * _loop1_208_var; void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings Token * a; @@ -21955,13 +21967,13 @@ invalid_try_stmt_rule(Parser *p) && (_literal = _PyPegen_expect_token(p, 11)) // token=':' && - (_loop0_206_var = _loop0_206_rule(p)) // block* + (_loop0_207_var = _loop0_207_rule(p)) // block* && - (_loop1_207_var = _loop1_207_rule(p)) // except_star_block+ + (_loop1_208_var = _loop1_208_rule(p)) // except_star_block+ && (a = _PyPegen_expect_token(p, 634)) // token='except' && - (_opt_var = _tmp_208_rule(p), !p->error_indicator) // [expression ['as' NAME]] + (_opt_var = _tmp_209_rule(p), !p->error_indicator) // [expression ['as' NAME]] && (_literal_1 = _PyPegen_expect_token(p, 11)) // token=':' ) @@ -22029,7 +22041,7 @@ invalid_except_stmt_rule(Parser *p) && (expressions_var = expressions_rule(p)) // expressions && - (_opt_var_1 = _tmp_209_rule(p), !p->error_indicator) // ['as' NAME] + (_opt_var_1 = _tmp_210_rule(p), !p->error_indicator) // ['as' NAME] && (_literal_1 = _PyPegen_expect_token(p, 11)) // token=':' ) @@ -22067,7 +22079,7 @@ invalid_except_stmt_rule(Parser *p) && (expression_var = expression_rule(p)) // expression && - (_opt_var_1 = _tmp_210_rule(p), !p->error_indicator) // ['as' NAME] + (_opt_var_1 = _tmp_211_rule(p), !p->error_indicator) // ['as' NAME] && (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) @@ -22119,14 +22131,14 @@ invalid_except_stmt_rule(Parser *p) } D(fprintf(stderr, "%*c> invalid_except_stmt[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except' '*' (NEWLINE | ':')")); Token * _literal; - void *_tmp_211_var; + void *_tmp_212_var; Token * a; if ( (a = _PyPegen_expect_token(p, 634)) // token='except' && (_literal = _PyPegen_expect_token(p, 16)) // token='*' && - (_tmp_211_var = _tmp_211_rule(p)) // NEWLINE | ':' + (_tmp_212_var = _tmp_212_rule(p)) // NEWLINE | ':' ) { D(fprintf(stderr, "%*c+ invalid_except_stmt[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except' '*' (NEWLINE | 
':')")); @@ -22233,7 +22245,7 @@ invalid_except_stmt_indent_rule(Parser *p) && (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_212_rule(p), !p->error_indicator) // ['as' NAME] + (_opt_var = _tmp_213_rule(p), !p->error_indicator) // ['as' NAME] && (_literal = _PyPegen_expect_token(p, 11)) // token=':' && @@ -22328,7 +22340,7 @@ invalid_except_star_stmt_indent_rule(Parser *p) && (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_213_rule(p), !p->error_indicator) // ['as' NAME] + (_opt_var = _tmp_214_rule(p), !p->error_indicator) // ['as' NAME] && (_literal_1 = _PyPegen_expect_token(p, 11)) // token=':' && @@ -22697,7 +22709,7 @@ invalid_class_argument_pattern_rule(Parser *p) asdl_pattern_seq* a; asdl_seq* keyword_patterns_var; if ( - (_opt_var = _tmp_214_rule(p), !p->error_indicator) // [positional_patterns ','] + (_opt_var = _tmp_215_rule(p), !p->error_indicator) // [positional_patterns ','] && (keyword_patterns_var = keyword_patterns_rule(p)) // keyword_patterns && @@ -23191,7 +23203,7 @@ invalid_def_raw_rule(Parser *p) && (_literal_1 = _PyPegen_expect_token(p, 8)) // token=')' && - (_opt_var_2 = _tmp_215_rule(p), !p->error_indicator) // ['->' expression] + (_opt_var_2 = _tmp_216_rule(p), !p->error_indicator) // ['->' expression] && (_literal_2 = _PyPegen_expect_token(p, 11)) // token=':' && @@ -23251,7 +23263,7 @@ invalid_class_def_raw_rule(Parser *p) && (name_var = _PyPegen_name_token(p)) // NAME && - (_opt_var = _tmp_216_rule(p), !p->error_indicator) // ['(' arguments? ')'] + (_opt_var = _tmp_217_rule(p), !p->error_indicator) // ['(' arguments? ')'] && (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) @@ -23286,7 +23298,7 @@ invalid_class_def_raw_rule(Parser *p) && (name_var = _PyPegen_name_token(p)) // NAME && - (_opt_var = _tmp_217_rule(p), !p->error_indicator) // ['(' arguments? ')'] + (_opt_var = _tmp_218_rule(p), !p->error_indicator) // ['(' arguments? 
')'] && (_literal = _PyPegen_expect_token(p, 11)) // token=':' && @@ -23337,11 +23349,11 @@ invalid_double_starred_kvpairs_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> invalid_double_starred_kvpairs[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ',' invalid_kvpair")); - asdl_seq * _gather_218_var; + asdl_seq * _gather_219_var; Token * _literal; void *invalid_kvpair_var; if ( - (_gather_218_var = _gather_218_rule(p)) // ','.double_starred_kvpair+ + (_gather_219_var = _gather_219_rule(p)) // ','.double_starred_kvpair+ && (_literal = _PyPegen_expect_token(p, 12)) // token=',' && @@ -23349,7 +23361,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) ) { D(fprintf(stderr, "%*c+ invalid_double_starred_kvpairs[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.double_starred_kvpair+ ',' invalid_kvpair")); - _res = _PyPegen_dummy_name(p, _gather_218_var, _literal, invalid_kvpair_var); + _res = _PyPegen_dummy_name(p, _gather_219_var, _literal, invalid_kvpair_var); goto done; } p->mark = _mark; @@ -23402,7 +23414,7 @@ invalid_double_starred_kvpairs_rule(Parser *p) && (a = _PyPegen_expect_token(p, 11)) // token=':' && - _PyPegen_lookahead(1, _tmp_220_rule, p) + _PyPegen_lookahead(1, _tmp_221_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_double_starred_kvpairs[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ':' &('}' | ',')")); @@ -23513,7 +23525,7 @@ invalid_kvpair_rule(Parser *p) && (a = _PyPegen_expect_token(p, 11)) // token=':' && - _PyPegen_lookahead(1, _tmp_221_rule, p) + _PyPegen_lookahead(1, _tmp_222_rule, p) ) { D(fprintf(stderr, "%*c+ invalid_kvpair[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ':' &('}' | ',')")); @@ -24364,12 +24376,12 @@ _loop1_14_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_14[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); - void *_tmp_222_var; + void *_tmp_223_var; while ( - (_tmp_222_var = _tmp_222_rule(p)) // star_targets '=' + (_tmp_223_var = _tmp_223_rule(p)) // star_targets '=' ) { - _res = _tmp_222_var; + _res = _tmp_223_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -24943,12 +24955,12 @@ _loop0_24_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_24[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' | '...')")); - void *_tmp_223_var; + void *_tmp_224_var; while ( - (_tmp_223_var = _tmp_223_rule(p)) // '.' | '...' + (_tmp_224_var = _tmp_224_rule(p)) // '.' | '...' ) { - _res = _tmp_223_var; + _res = _tmp_224_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -25011,12 +25023,12 @@ _loop1_25_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_25[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('.' | '...')")); - void *_tmp_224_var; + void *_tmp_225_var; while ( - (_tmp_224_var = _tmp_224_rule(p)) // '.' | '...' + (_tmp_225_var = _tmp_225_rule(p)) // '.' | '...' 
) { - _res = _tmp_224_var; + _res = _tmp_225_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -25416,12 +25428,12 @@ _loop1_32_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_32[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('@' named_expression NEWLINE)")); - void *_tmp_225_var; + void *_tmp_226_var; while ( - (_tmp_225_var = _tmp_225_rule(p)) // '@' named_expression NEWLINE + (_tmp_226_var = _tmp_226_rule(p)) // '@' named_expression NEWLINE ) { - _res = _tmp_225_var; + _res = _tmp_226_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28477,12 +28489,12 @@ _loop1_80_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_80[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' expression)")); - void *_tmp_226_var; + void *_tmp_227_var; while ( - (_tmp_226_var = _tmp_226_rule(p)) // ',' expression + (_tmp_227_var = _tmp_227_rule(p)) // ',' expression ) { - _res = _tmp_226_var; + _res = _tmp_227_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28550,12 +28562,12 @@ _loop1_81_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_81[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_expression)")); - void *_tmp_227_var; + void *_tmp_228_var; while ( - (_tmp_227_var = _tmp_227_rule(p)) // ',' star_expression + (_tmp_228_var = _tmp_228_rule(p)) // ',' star_expression ) { - _res = _tmp_227_var; + _res = _tmp_228_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28742,12 +28754,12 @@ _loop1_84_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_84[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('or' conjunction)")); - void *_tmp_228_var; + void *_tmp_229_var; while ( - (_tmp_228_var = _tmp_228_rule(p)) // 'or' conjunction + (_tmp_229_var = _tmp_229_rule(p)) // 'or' conjunction ) { - _res = _tmp_228_var; + _res = _tmp_229_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -28815,12 +28827,12 @@ _loop1_85_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_85[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('and' inversion)")); - void *_tmp_229_var; + void *_tmp_230_var; while ( - (_tmp_229_var = _tmp_229_rule(p)) // 'and' inversion + (_tmp_230_var = _tmp_230_rule(p)) // 'and' inversion ) { - _res = _tmp_229_var; + _res = _tmp_230_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -29010,7 +29022,7 @@ _loop0_89_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_230_rule(p)) // slice | starred_expression + (elem = _tmp_231_rule(p)) // slice | starred_expression ) { _res = elem; @@ -29076,7 +29088,7 @@ _gather_88_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_230_rule(p)) // slice | starred_expression + (elem = _tmp_231_rule(p)) // slice | starred_expression && (seq = _loop0_89_rule(p)) // _loop0_89 ) @@ -30762,12 +30774,12 @@ _loop0_114_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_114[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); - void *_tmp_231_var; + void 
*_tmp_232_var; while ( - (_tmp_231_var = _tmp_231_rule(p)) // 'if' disjunction + (_tmp_232_var = _tmp_232_rule(p)) // 'if' disjunction ) { - _res = _tmp_231_var; + _res = _tmp_232_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -30830,12 +30842,12 @@ _loop0_115_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_115[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "('if' disjunction)")); - void *_tmp_232_var; + void *_tmp_233_var; while ( - (_tmp_232_var = _tmp_232_rule(p)) // 'if' disjunction + (_tmp_233_var = _tmp_233_rule(p)) // 'if' disjunction ) { - _res = _tmp_232_var; + _res = _tmp_233_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -30963,7 +30975,7 @@ _loop0_118_rule(Parser *p) while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_233_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' + (elem = _tmp_234_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' ) { _res = elem; @@ -31030,7 +31042,7 @@ _gather_117_rule(Parser *p) void *elem; asdl_seq * seq; if ( - (elem = _tmp_233_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' + (elem = _tmp_234_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' && (seq = _loop0_118_rule(p)) // _loop0_118 ) @@ -31601,12 +31613,12 @@ _loop0_128_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop0_128[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); - void *_tmp_234_var; + void *_tmp_235_var; while ( - (_tmp_234_var = _tmp_234_rule(p)) // ',' star_target + (_tmp_235_var = _tmp_235_rule(p)) // ',' star_target ) { - _res = _tmp_234_var; + _res = _tmp_235_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -31788,12 +31800,12 @@ _loop1_131_rule(Parser *p) return NULL; } D(fprintf(stderr, "%*c> _loop1_131[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(',' star_target)")); - void *_tmp_235_var; + void *_tmp_236_var; while ( - (_tmp_235_var = _tmp_235_rule(p)) // ',' star_target + (_tmp_236_var = _tmp_236_rule(p)) // ',' star_target ) { - _res = _tmp_235_var; + _res = _tmp_236_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -32510,9 +32522,69 @@ _tmp_143_rule(Parser *p) return _res; } -// _tmp_144: args | expression for_if_clauses +// _tmp_144: +// | (','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) +// | kwargs static void * _tmp_144_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void * _res = NULL; + int _mark = p->mark; + { // (','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs) + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs)")); + void *_tmp_237_var; + if ( + (_tmp_237_var = _tmp_237_rule(p)) // ','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs + ) + { + D(fprintf(stderr, "%*c+ 
_tmp_144[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs)")); + _res = _tmp_237_var; + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_144[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs)")); + } + { // kwargs + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "kwargs")); + asdl_seq* kwargs_var; + if ( + (kwargs_var = kwargs_rule(p)) // kwargs + ) + { + D(fprintf(stderr, "%*c+ _tmp_144[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "kwargs")); + _res = kwargs_var; + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_144[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "kwargs")); + } + _res = NULL; + done: + p->level--; + return _res; +} + +// _tmp_145: args | expression for_if_clauses +static void * +_tmp_145_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32529,18 +32601,18 @@ _tmp_144_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args")); + D(fprintf(stderr, "%*c> _tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "args")); expr_ty args_var; if ( (args_var = args_rule(p)) // args ) { - D(fprintf(stderr, "%*c+ _tmp_144[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "args")); + D(fprintf(stderr, "%*c+ _tmp_145[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "args")); _res = args_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_144[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_145[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "args")); } { // expression for_if_clauses @@ -32548,7 +32620,7 @@ _tmp_144_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_144[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses")); + D(fprintf(stderr, "%*c> _tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses")); expr_ty expression_var; asdl_comprehension_seq* for_if_clauses_var; if ( @@ -32557,12 +32629,12 @@ _tmp_144_rule(Parser *p) (for_if_clauses_var = for_if_clauses_rule(p)) // for_if_clauses ) { - D(fprintf(stderr, "%*c+ _tmp_144[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses")); + D(fprintf(stderr, "%*c+ _tmp_145[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression for_if_clauses")); _res = _PyPegen_dummy_name(p, expression_var, for_if_clauses_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_144[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_145[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expression for_if_clauses")); } _res = NULL; @@ -32571,9 +32643,9 @@ _tmp_144_rule(Parser *p) return _res; } -// _tmp_145: 'True' | 'False' | 'None' +// _tmp_146: 'True' | 'False' | 'None' static void * -_tmp_145_rule(Parser *p) +_tmp_146_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32590,18 +32662,18 @@ _tmp_145_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); + D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 600)) // token='True' ) { - D(fprintf(stderr, "%*c+ _tmp_145[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); + D(fprintf(stderr, "%*c+ _tmp_146[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_145[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_146[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'True'")); } { // 'False' @@ -32609,18 +32681,18 @@ _tmp_145_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); + D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 602)) // token='False' ) { - D(fprintf(stderr, "%*c+ _tmp_145[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); + D(fprintf(stderr, "%*c+ _tmp_146[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_145[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_146[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'False'")); } { // 'None' @@ -32628,18 +32700,18 @@ _tmp_145_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_145[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); + D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 601)) // token='None' ) { - D(fprintf(stderr, "%*c+ _tmp_145[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); + D(fprintf(stderr, "%*c+ _tmp_146[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_145[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_146[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'None'")); } _res = NULL; @@ -32648,9 +32720,9 @@ _tmp_145_rule(Parser *p) return _res; } -// _tmp_146: NAME '=' +// _tmp_147: NAME '=' static void * -_tmp_146_rule(Parser *p) +_tmp_147_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32667,7 +32739,7 @@ _tmp_146_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_146[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '='")); + D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME '='")); Token * _literal; expr_ty name_var; if ( @@ -32676,12 +32748,12 @@ _tmp_146_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_146[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '='")); + D(fprintf(stderr, "%*c+ _tmp_147[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME '='")); _res = _PyPegen_dummy_name(p, name_var, _literal); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_146[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_147[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "NAME '='")); } _res = NULL; @@ -32690,9 +32762,9 @@ _tmp_146_rule(Parser *p) return _res; } -// _tmp_147: NAME STRING | SOFT_KEYWORD +// _tmp_148: NAME STRING | SOFT_KEYWORD static void * -_tmp_147_rule(Parser *p) +_tmp_148_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32709,7 +32781,7 @@ _tmp_147_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME STRING")); + D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NAME STRING")); expr_ty name_var; expr_ty string_var; if ( @@ -32718,12 +32790,12 @@ _tmp_147_rule(Parser *p) (string_var = _PyPegen_string_token(p)) // STRING ) { - D(fprintf(stderr, "%*c+ _tmp_147[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME STRING")); + D(fprintf(stderr, "%*c+ _tmp_148[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NAME STRING")); _res = _PyPegen_dummy_name(p, name_var, string_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_147[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_148[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "NAME STRING")); } { // SOFT_KEYWORD @@ -32731,18 +32803,18 @@ _tmp_147_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_147[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "SOFT_KEYWORD")); + D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "SOFT_KEYWORD")); expr_ty soft_keyword_var; if ( (soft_keyword_var = _PyPegen_soft_keyword_token(p)) // SOFT_KEYWORD ) { - D(fprintf(stderr, "%*c+ _tmp_147[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "SOFT_KEYWORD")); + D(fprintf(stderr, "%*c+ _tmp_148[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "SOFT_KEYWORD")); _res = soft_keyword_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_147[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_148[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "SOFT_KEYWORD")); } _res = NULL; @@ -32751,9 +32823,9 @@ _tmp_147_rule(Parser *p) return _res; } -// _tmp_148: 'else' | ':' +// _tmp_149: 'else' | ':' static void * -_tmp_148_rule(Parser *p) +_tmp_149_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32770,18 +32842,18 @@ _tmp_148_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else'")); + D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'else'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 642)) // token='else' ) { - D(fprintf(stderr, "%*c+ _tmp_148[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'else'")); + D(fprintf(stderr, "%*c+ _tmp_149[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'else'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_148[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_149[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'else'")); } { // ':' @@ -32789,18 +32861,18 @@ _tmp_148_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_148[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 11)) // token=':' ) { - D(fprintf(stderr, "%*c+ _tmp_148[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c+ _tmp_149[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_148[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_149[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':'")); } _res = NULL; @@ -32809,9 +32881,9 @@ _tmp_148_rule(Parser *p) return _res; } -// _tmp_149: '=' | ':=' +// _tmp_150: '=' | ':=' static void * -_tmp_149_rule(Parser *p) +_tmp_150_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32828,18 +32900,18 @@ _tmp_149_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); + D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_149[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'='")); + D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'='")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_149[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'='")); } { // ':=' @@ -32847,18 +32919,18 @@ _tmp_149_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_149[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); + D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 53)) // token=':=' ) { - D(fprintf(stderr, "%*c+ _tmp_149[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':='")); + D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':='")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_149[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':='")); } _res = NULL; @@ -32867,9 +32939,9 @@ _tmp_149_rule(Parser *p) return _res; } -// _tmp_150: list | tuple | genexp | 'True' | 'None' | 'False' +// _tmp_151: list | tuple | genexp | 'True' | 'None' | 'False' static void * -_tmp_150_rule(Parser *p) +_tmp_151_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -32886,18 +32958,18 @@ _tmp_150_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); + D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "list")); expr_ty list_var; if ( (list_var = list_rule(p)) // list ) { - D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "list")); + D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "list")); _res = list_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "list")); } { // tuple @@ -32905,18 +32977,18 @@ _tmp_150_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); + D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "tuple")); expr_ty tuple_var; if ( (tuple_var = tuple_rule(p)) // tuple ) { - D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "tuple")); + D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "tuple")); _res = tuple_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "tuple")); } { // genexp @@ -32924,18 +32996,18 @@ _tmp_150_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "genexp")); + D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "genexp")); expr_ty genexp_var; if ( (genexp_var = genexp_rule(p)) // genexp ) { - D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "genexp")); + D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "genexp")); _res = genexp_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "genexp")); } { // 'True' @@ -32943,18 +33015,18 @@ _tmp_150_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); + D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'True'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 600)) // token='True' ) { - D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); + D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'True'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'True'")); } { // 'None' @@ -32962,18 +33034,18 @@ _tmp_150_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); + D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'None'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 601)) // token='None' ) { - D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); + D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'None'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'None'")); } { // 'False' @@ -32981,18 +33053,18 @@ _tmp_150_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_150[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); + D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'False'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 602)) // token='False' ) { - D(fprintf(stderr, "%*c+ _tmp_150[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); + D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'False'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_150[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'False'")); } _res = NULL; @@ -33001,9 +33073,9 @@ _tmp_150_rule(Parser *p) return _res; } -// _tmp_151: '=' | ':=' +// _tmp_152: '=' | ':=' static void * -_tmp_151_rule(Parser *p) +_tmp_152_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33020,18 +33092,18 @@ _tmp_151_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); + D(fprintf(stderr, "%*c> _tmp_152[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'='")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'='")); + D(fprintf(stderr, "%*c+ _tmp_152[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'='")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_152[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? 
"ERROR!" : "-", _mark, p->mark, "'='")); } { // ':=' @@ -33039,18 +33111,18 @@ _tmp_151_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_151[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); + D(fprintf(stderr, "%*c> _tmp_152[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':='")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 53)) // token=':=' ) { - D(fprintf(stderr, "%*c+ _tmp_151[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':='")); + D(fprintf(stderr, "%*c+ _tmp_152[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':='")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_151[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_152[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':='")); } _res = NULL; @@ -33059,9 +33131,9 @@ _tmp_151_rule(Parser *p) return _res; } -// _loop0_152: star_named_expressions +// _loop0_153: star_named_expressions static asdl_seq * -_loop0_152_rule(Parser *p) +_loop0_153_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33087,7 +33159,7 @@ _loop0_152_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_152[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expressions")); + D(fprintf(stderr, "%*c> _loop0_153[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_named_expressions")); asdl_expr_seq* star_named_expressions_var; while ( (star_named_expressions_var = star_named_expressions_rule(p)) // star_named_expressions @@ -33110,7 +33182,7 @@ _loop0_152_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_152[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_153[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "star_named_expressions")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33127,9 +33199,9 @@ _loop0_152_rule(Parser *p) return _seq; } -// _loop0_153: (star_targets '=') +// _loop0_154: (star_targets '=') static asdl_seq * -_loop0_153_rule(Parser *p) +_loop0_154_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33155,13 +33227,13 @@ _loop0_153_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_153[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); - void *_tmp_236_var; + D(fprintf(stderr, "%*c> _loop0_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); + void *_tmp_238_var; while ( - (_tmp_236_var = _tmp_236_rule(p)) // star_targets '=' + (_tmp_238_var = _tmp_238_rule(p)) // star_targets '=' ) { - _res = _tmp_236_var; + _res = _tmp_238_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -33178,7 +33250,7 @@ _loop0_153_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_153[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_154[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "(star_targets '=')")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33195,9 +33267,9 @@ _loop0_153_rule(Parser *p) return _seq; } -// _loop0_154: (star_targets '=') +// _loop0_155: (star_targets '=') static asdl_seq * -_loop0_154_rule(Parser *p) +_loop0_155_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33223,13 +33295,13 @@ _loop0_154_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_154[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); - void *_tmp_237_var; + D(fprintf(stderr, "%*c> _loop0_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(star_targets '=')")); + void *_tmp_239_var; while ( - (_tmp_237_var = _tmp_237_rule(p)) // star_targets '=' + (_tmp_239_var = _tmp_239_rule(p)) // star_targets '=' ) { - _res = _tmp_237_var; + _res = _tmp_239_var; if (_n == _children_capacity) { _children_capacity *= 2; void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); @@ -33246,7 +33318,7 @@ _loop0_154_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_154[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_155[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(star_targets '=')")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33263,9 +33335,9 @@ _loop0_154_rule(Parser *p) return _seq; } -// _tmp_155: yield_expr | star_expressions +// _tmp_156: yield_expr | star_expressions static void * -_tmp_155_rule(Parser *p) +_tmp_156_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33282,18 +33354,18 @@ _tmp_155_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); + D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "yield_expr")); expr_ty yield_expr_var; if ( (yield_expr_var = yield_expr_rule(p)) // yield_expr ) { - D(fprintf(stderr, "%*c+ _tmp_155[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "yield_expr")); + D(fprintf(stderr, "%*c+ _tmp_156[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "yield_expr")); _res = yield_expr_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_155[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_156[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "yield_expr")); } { // star_expressions @@ -33301,18 +33373,18 @@ _tmp_155_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_155[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); + D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_expressions")); expr_ty star_expressions_var; if ( (star_expressions_var = star_expressions_rule(p)) // star_expressions ) { - D(fprintf(stderr, "%*c+ _tmp_155[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expressions")); + D(fprintf(stderr, "%*c+ _tmp_156[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_expressions")); _res = star_expressions_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_155[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_156[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "star_expressions")); } _res = NULL; @@ -33321,9 +33393,9 @@ _tmp_155_rule(Parser *p) return _res; } -// _tmp_156: '[' | '(' | '{' +// _tmp_157: '[' | '(' | '{' static void * -_tmp_156_rule(Parser *p) +_tmp_157_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33340,18 +33412,18 @@ _tmp_156_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); + D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 9)) // token='[' ) { - D(fprintf(stderr, "%*c+ _tmp_156[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'['")); + D(fprintf(stderr, "%*c+ _tmp_157[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'['")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_156[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_157[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'['")); } { // '(' @@ -33359,18 +33431,18 @@ _tmp_156_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); + D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'('")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 7)) // token='(' ) { - D(fprintf(stderr, "%*c+ _tmp_156[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'('")); + D(fprintf(stderr, "%*c+ _tmp_157[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'('")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_156[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_157[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'('")); } { // '{' @@ -33378,18 +33450,18 @@ _tmp_156_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_156[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); + D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 25)) // token='{' ) { - D(fprintf(stderr, "%*c+ _tmp_156[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{'")); + D(fprintf(stderr, "%*c+ _tmp_157[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_156[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_157[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'{'")); } _res = NULL; @@ -33398,9 +33470,9 @@ _tmp_156_rule(Parser *p) return _res; } -// _tmp_157: '[' | '{' +// _tmp_158: '[' | '{' static void * -_tmp_157_rule(Parser *p) +_tmp_158_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33417,18 +33489,18 @@ _tmp_157_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); + D(fprintf(stderr, "%*c> _tmp_158[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 9)) // token='[' ) { - D(fprintf(stderr, "%*c+ _tmp_157[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'['")); + D(fprintf(stderr, "%*c+ _tmp_158[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'['")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_157[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_158[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'['")); } { // '{' @@ -33436,18 +33508,18 @@ _tmp_157_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_157[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); + D(fprintf(stderr, "%*c> _tmp_158[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 25)) // token='{' ) { - D(fprintf(stderr, "%*c+ _tmp_157[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{'")); + D(fprintf(stderr, "%*c+ _tmp_158[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_157[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_158[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'{'")); } _res = NULL; @@ -33456,9 +33528,9 @@ _tmp_157_rule(Parser *p) return _res; } -// _tmp_158: '[' | '{' +// _tmp_159: '[' | '{' static void * -_tmp_158_rule(Parser *p) +_tmp_159_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33475,18 +33547,18 @@ _tmp_158_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_158[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); + D(fprintf(stderr, "%*c> _tmp_159[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'['")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 9)) // token='[' ) { - D(fprintf(stderr, "%*c+ _tmp_158[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'['")); + D(fprintf(stderr, "%*c+ _tmp_159[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'['")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_158[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_159[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'['")); } { // '{' @@ -33494,18 +33566,18 @@ _tmp_158_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_158[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); + D(fprintf(stderr, "%*c> _tmp_159[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'{'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 25)) // token='{' ) { - D(fprintf(stderr, "%*c+ _tmp_158[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{'")); + D(fprintf(stderr, "%*c+ _tmp_159[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'{'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_158[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_159[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'{'")); } _res = NULL; @@ -33514,9 +33586,9 @@ _tmp_158_rule(Parser *p) return _res; } -// _loop0_159: param_no_default +// _loop0_160: param_no_default static asdl_seq * -_loop0_159_rule(Parser *p) +_loop0_160_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33542,7 +33614,7 @@ _loop0_159_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_159[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c> _loop0_160[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); arg_ty param_no_default_var; while ( (param_no_default_var = param_no_default_rule(p)) // param_no_default @@ -33565,7 +33637,7 @@ _loop0_159_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_159[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_160[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_no_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33582,9 +33654,9 @@ _loop0_159_rule(Parser *p) return _seq; } -// _loop0_160: param_no_default +// _loop0_161: param_no_default static asdl_seq * -_loop0_160_rule(Parser *p) +_loop0_161_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33610,7 +33682,7 @@ _loop0_160_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_160[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c> _loop0_161[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); arg_ty param_no_default_var; while ( (param_no_default_var = param_no_default_rule(p)) // param_no_default @@ -33633,7 +33705,7 @@ _loop0_160_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_160[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_161[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "param_no_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33650,9 +33722,9 @@ _loop0_160_rule(Parser *p) return _seq; } -// _loop1_161: param_no_default +// _loop1_162: param_no_default static asdl_seq * -_loop1_161_rule(Parser *p) +_loop1_162_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33678,7 +33750,7 @@ _loop1_161_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_161[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c> _loop1_162[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); arg_ty param_no_default_var; while ( (param_no_default_var = param_no_default_rule(p)) // param_no_default @@ -33701,7 +33773,7 @@ _loop1_161_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_161[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_162[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_no_default")); } if (_n == 0 || p->error_indicator) { @@ -33723,9 +33795,9 @@ _loop1_161_rule(Parser *p) return _seq; } -// _tmp_162: slash_no_default | slash_with_default +// _tmp_163: slash_no_default | slash_with_default static void * -_tmp_162_rule(Parser *p) +_tmp_163_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33742,18 +33814,18 @@ _tmp_162_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_162[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_no_default")); + D(fprintf(stderr, "%*c> _tmp_163[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_no_default")); asdl_arg_seq* slash_no_default_var; if ( (slash_no_default_var = slash_no_default_rule(p)) // slash_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_162[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_no_default")); + D(fprintf(stderr, "%*c+ _tmp_163[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_no_default")); _res = slash_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_162[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_163[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "slash_no_default")); } { // slash_with_default @@ -33761,18 +33833,18 @@ _tmp_162_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_162[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default")); + D(fprintf(stderr, "%*c> _tmp_163[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default")); SlashWithDefault* slash_with_default_var; if ( (slash_with_default_var = slash_with_default_rule(p)) // slash_with_default ) { - D(fprintf(stderr, "%*c+ _tmp_162[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_with_default")); + D(fprintf(stderr, "%*c+ _tmp_163[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_with_default")); _res = slash_with_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_162[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_163[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "slash_with_default")); } _res = NULL; @@ -33781,9 +33853,9 @@ _tmp_162_rule(Parser *p) return _res; } -// _loop0_163: param_maybe_default +// _loop0_164: param_maybe_default static asdl_seq * -_loop0_163_rule(Parser *p) +_loop0_164_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33809,7 +33881,7 @@ _loop0_163_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_163[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_164[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); NameDefaultPair* param_maybe_default_var; while ( (param_maybe_default_var = param_maybe_default_rule(p)) // param_maybe_default @@ -33832,7 +33904,7 @@ _loop0_163_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_163[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_164[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33849,9 +33921,9 @@ _loop0_163_rule(Parser *p) return _seq; } -// _tmp_164: slash_no_default | slash_with_default +// _tmp_165: slash_no_default | slash_with_default static void * -_tmp_164_rule(Parser *p) +_tmp_165_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33868,18 +33940,18 @@ _tmp_164_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_164[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_no_default")); + D(fprintf(stderr, "%*c> _tmp_165[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_no_default")); asdl_arg_seq* slash_no_default_var; if ( (slash_no_default_var = slash_no_default_rule(p)) // slash_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_164[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_no_default")); + D(fprintf(stderr, "%*c+ _tmp_165[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_no_default")); _res = slash_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_164[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_165[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "slash_no_default")); } { // slash_with_default @@ -33887,18 +33959,18 @@ _tmp_164_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_164[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default")); + D(fprintf(stderr, "%*c> _tmp_165[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slash_with_default")); SlashWithDefault* slash_with_default_var; if ( (slash_with_default_var = slash_with_default_rule(p)) // slash_with_default ) { - D(fprintf(stderr, "%*c+ _tmp_164[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_with_default")); + D(fprintf(stderr, "%*c+ _tmp_165[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slash_with_default")); _res = slash_with_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_164[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_165[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "slash_with_default")); } _res = NULL; @@ -33907,9 +33979,9 @@ _tmp_164_rule(Parser *p) return _res; } -// _loop0_165: param_maybe_default +// _loop0_166: param_maybe_default static asdl_seq * -_loop0_165_rule(Parser *p) +_loop0_166_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33935,7 +34007,7 @@ _loop0_165_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_165[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_166[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); NameDefaultPair* param_maybe_default_var; while ( (param_maybe_default_var = param_maybe_default_rule(p)) // param_maybe_default @@ -33958,7 +34030,7 @@ _loop0_165_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_165[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_166[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -33975,9 +34047,9 @@ _loop0_165_rule(Parser *p) return _seq; } -// _tmp_166: ',' | param_no_default +// _tmp_167: ',' | param_no_default static void * -_tmp_166_rule(Parser *p) +_tmp_167_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -33994,18 +34066,18 @@ _tmp_166_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_166[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_166[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_167[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_166[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_167[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } { // param_no_default @@ -34013,18 +34085,18 @@ _tmp_166_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_166[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c> _tmp_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); arg_ty param_no_default_var; if ( (param_no_default_var = param_no_default_rule(p)) // param_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_166[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c+ _tmp_167[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_no_default")); _res = param_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_166[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_167[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "param_no_default")); } _res = NULL; @@ -34033,9 +34105,9 @@ _tmp_166_rule(Parser *p) return _res; } -// _loop0_167: param_maybe_default +// _loop0_168: param_maybe_default static asdl_seq * -_loop0_167_rule(Parser *p) +_loop0_168_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34061,7 +34133,7 @@ _loop0_167_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_167[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_168[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); NameDefaultPair* param_maybe_default_var; while ( (param_maybe_default_var = param_maybe_default_rule(p)) // param_maybe_default @@ -34084,7 +34156,7 @@ _loop0_167_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_167[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_168[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34101,9 +34173,9 @@ _loop0_167_rule(Parser *p) return _seq; } -// _loop1_168: param_maybe_default +// _loop1_169: param_maybe_default static asdl_seq * -_loop1_168_rule(Parser *p) +_loop1_169_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34129,7 +34201,7 @@ _loop1_168_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_168[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); + D(fprintf(stderr, "%*c> _loop1_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); NameDefaultPair* param_maybe_default_var; while ( (param_maybe_default_var = param_maybe_default_rule(p)) // param_maybe_default @@ -34152,7 +34224,7 @@ _loop1_168_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_168[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_169[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_maybe_default")); } if (_n == 0 || p->error_indicator) { @@ -34174,9 +34246,9 @@ _loop1_168_rule(Parser *p) return _seq; } -// _tmp_169: ')' | ',' +// _tmp_170: ')' | ',' static void * -_tmp_169_rule(Parser *p) +_tmp_170_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34193,18 +34265,18 @@ _tmp_169_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c> _tmp_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_169[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c+ _tmp_170[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_169[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_170[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "')'")); } { // ',' @@ -34212,18 +34284,18 @@ _tmp_169_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_169[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_169[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_170[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_169[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_170[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -34232,9 +34304,9 @@ _tmp_169_rule(Parser *p) return _res; } -// _tmp_170: ')' | ',' (')' | '**') +// _tmp_171: ')' | ',' (')' | '**') static void * -_tmp_170_rule(Parser *p) +_tmp_171_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34251,18 +34323,18 @@ _tmp_170_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c> _tmp_171[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_170[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c+ _tmp_171[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_170[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_171[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "')'")); } { // ',' (')' | '**') @@ -34270,21 +34342,21 @@ _tmp_170_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_170[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); + D(fprintf(stderr, "%*c> _tmp_171[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); Token * _literal; - void *_tmp_238_var; + void *_tmp_240_var; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_tmp_238_var = _tmp_238_rule(p)) // ')' | '**' + (_tmp_240_var = _tmp_240_rule(p)) // ')' | '**' ) { - D(fprintf(stderr, "%*c+ _tmp_170[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); - _res = _PyPegen_dummy_name(p, _literal, _tmp_238_var); + D(fprintf(stderr, "%*c+ _tmp_171[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' (')' | '**')")); + _res = _PyPegen_dummy_name(p, _literal, _tmp_240_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_170[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_171[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' (')' | '**')")); } _res = NULL; @@ -34293,9 +34365,9 @@ _tmp_170_rule(Parser *p) return _res; } -// _tmp_171: param_no_default | ',' +// _tmp_172: param_no_default | ',' static void * -_tmp_171_rule(Parser *p) +_tmp_172_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34312,18 +34384,18 @@ _tmp_171_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_171[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c> _tmp_172[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); arg_ty param_no_default_var; if ( (param_no_default_var = param_no_default_rule(p)) // param_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_171[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c+ _tmp_172[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_no_default")); _res = param_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_171[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_172[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_no_default")); } { // ',' @@ -34331,18 +34403,18 @@ _tmp_171_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_171[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_172[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_171[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_172[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_171[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_172[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -34351,9 +34423,9 @@ _tmp_171_rule(Parser *p) return _res; } -// _loop0_172: param_maybe_default +// _loop0_173: param_maybe_default static asdl_seq * -_loop0_172_rule(Parser *p) +_loop0_173_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34379,7 +34451,7 @@ _loop0_172_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_172[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_173[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_maybe_default")); NameDefaultPair* param_maybe_default_var; while ( (param_maybe_default_var = param_maybe_default_rule(p)) // param_maybe_default @@ -34402,7 +34474,7 @@ _loop0_172_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_172[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_173[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34419,9 +34491,9 @@ _loop0_172_rule(Parser *p) return _seq; } -// _tmp_173: param_no_default | ',' +// _tmp_174: param_no_default | ',' static void * -_tmp_173_rule(Parser *p) +_tmp_174_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34438,18 +34510,18 @@ _tmp_173_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_173[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c> _tmp_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_no_default")); arg_ty param_no_default_var; if ( (param_no_default_var = param_no_default_rule(p)) // param_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_173[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_no_default")); + D(fprintf(stderr, "%*c+ _tmp_174[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "param_no_default")); _res = param_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_173[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_174[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "param_no_default")); } { // ',' @@ -34457,18 +34529,18 @@ _tmp_173_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_173[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_173[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_174[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_173[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_174[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -34477,9 +34549,9 @@ _tmp_173_rule(Parser *p) return _res; } -// _tmp_174: '*' | '**' | '/' +// _tmp_175: '*' | '**' | '/' static void * -_tmp_174_rule(Parser *p) +_tmp_175_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34496,18 +34568,18 @@ _tmp_174_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*'")); + D(fprintf(stderr, "%*c> _tmp_175[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 16)) // token='*' ) { - D(fprintf(stderr, "%*c+ _tmp_174[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*'")); + D(fprintf(stderr, "%*c+ _tmp_175[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_174[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_175[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'*'")); } { // '**' @@ -34515,18 +34587,18 @@ _tmp_174_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c> _tmp_175[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 35)) // token='**' ) { - D(fprintf(stderr, "%*c+ _tmp_174[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c+ _tmp_175[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_174[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_175[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'**'")); } { // '/' @@ -34534,18 +34606,18 @@ _tmp_174_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_174[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'/'")); + D(fprintf(stderr, "%*c> _tmp_175[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'/'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 17)) // token='/' ) { - D(fprintf(stderr, "%*c+ _tmp_174[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'/'")); + D(fprintf(stderr, "%*c+ _tmp_175[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'/'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_174[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_175[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'/'")); } _res = NULL; @@ -34554,9 +34626,9 @@ _tmp_174_rule(Parser *p) return _res; } -// _loop1_175: param_with_default +// _loop1_176: param_with_default static asdl_seq * -_loop1_175_rule(Parser *p) +_loop1_176_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34582,7 +34654,7 @@ _loop1_175_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_175[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); + D(fprintf(stderr, "%*c> _loop1_176[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "param_with_default")); NameDefaultPair* param_with_default_var; while ( (param_with_default_var = param_with_default_rule(p)) // param_with_default @@ -34605,7 +34677,7 @@ _loop1_175_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_175[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_176[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "param_with_default")); } if (_n == 0 || p->error_indicator) { @@ -34627,9 +34699,9 @@ _loop1_175_rule(Parser *p) return _seq; } -// _loop0_176: lambda_param_no_default +// _loop0_177: lambda_param_no_default static asdl_seq * -_loop0_176_rule(Parser *p) +_loop0_177_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34655,7 +34727,7 @@ _loop0_176_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_176[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c> _loop0_177[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); arg_ty lambda_param_no_default_var; while ( (lambda_param_no_default_var = lambda_param_no_default_rule(p)) // lambda_param_no_default @@ -34678,7 +34750,7 @@ _loop0_176_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_176[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_177[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_no_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34695,9 +34767,9 @@ _loop0_176_rule(Parser *p) return _seq; } -// _loop0_177: lambda_param_no_default +// _loop0_178: lambda_param_no_default static asdl_seq * -_loop0_177_rule(Parser *p) +_loop0_178_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34723,7 +34795,7 @@ _loop0_177_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_177[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c> _loop0_178[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); arg_ty lambda_param_no_default_var; while ( (lambda_param_no_default_var = lambda_param_no_default_rule(p)) // lambda_param_no_default @@ -34746,7 +34818,7 @@ _loop0_177_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_177[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_178[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_no_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34763,9 +34835,9 @@ _loop0_177_rule(Parser *p) return _seq; } -// _loop0_179: ',' lambda_param +// _loop0_180: ',' lambda_param static asdl_seq * -_loop0_179_rule(Parser *p) +_loop0_180_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34791,7 +34863,7 @@ _loop0_179_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_179[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' lambda_param")); + D(fprintf(stderr, "%*c> _loop0_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' lambda_param")); Token * _literal; arg_ty elem; while ( @@ -34823,7 +34895,7 @@ _loop0_179_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_179[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_180[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' lambda_param")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -34840,9 +34912,9 @@ _loop0_179_rule(Parser *p) return _seq; } -// _gather_178: lambda_param _loop0_179 +// _gather_179: lambda_param _loop0_180 static asdl_seq * -_gather_178_rule(Parser *p) +_gather_179_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34854,27 +34926,27 @@ _gather_178_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // lambda_param _loop0_179 + { // lambda_param _loop0_180 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_178[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param _loop0_179")); + D(fprintf(stderr, "%*c> _gather_179[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param _loop0_180")); arg_ty elem; asdl_seq * seq; if ( (elem = lambda_param_rule(p)) // lambda_param && - (seq = _loop0_179_rule(p)) // _loop0_179 + (seq = _loop0_180_rule(p)) // _loop0_180 ) { - D(fprintf(stderr, "%*c+ _gather_178[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param _loop0_179")); + D(fprintf(stderr, "%*c+ _gather_179[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param _loop0_180")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_178[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param _loop0_179")); + D(fprintf(stderr, "%*c%s _gather_179[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param _loop0_180")); } _res = NULL; done: @@ -34882,9 +34954,9 @@ _gather_178_rule(Parser *p) return _res; } -// _tmp_180: lambda_slash_no_default | lambda_slash_with_default +// _tmp_181: lambda_slash_no_default | lambda_slash_with_default static void * -_tmp_180_rule(Parser *p) +_tmp_181_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34901,18 +34973,18 @@ _tmp_180_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); + D(fprintf(stderr, "%*c> _tmp_181[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); asdl_arg_seq* lambda_slash_no_default_var; if ( (lambda_slash_no_default_var = lambda_slash_no_default_rule(p)) // lambda_slash_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_180[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); + D(fprintf(stderr, "%*c+ _tmp_181[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); _res = lambda_slash_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_180[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_181[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "lambda_slash_no_default")); } { // lambda_slash_with_default @@ -34920,18 +34992,18 @@ _tmp_180_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_180[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); + D(fprintf(stderr, "%*c> _tmp_181[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); SlashWithDefault* lambda_slash_with_default_var; if ( (lambda_slash_with_default_var = lambda_slash_with_default_rule(p)) // lambda_slash_with_default ) { - D(fprintf(stderr, "%*c+ _tmp_180[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); + D(fprintf(stderr, "%*c+ _tmp_181[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); _res = lambda_slash_with_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_180[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_181[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_slash_with_default")); } _res = NULL; @@ -34940,9 +35012,9 @@ _tmp_180_rule(Parser *p) return _res; } -// _loop0_181: lambda_param_maybe_default +// _loop0_182: lambda_param_maybe_default static asdl_seq * -_loop0_181_rule(Parser *p) +_loop0_182_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -34968,7 +35040,7 @@ _loop0_181_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_181[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_182[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); NameDefaultPair* lambda_param_maybe_default_var; while ( (lambda_param_maybe_default_var = lambda_param_maybe_default_rule(p)) // lambda_param_maybe_default @@ -34991,7 +35063,7 @@ _loop0_181_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_181[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_182[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35008,9 +35080,9 @@ _loop0_181_rule(Parser *p) return _seq; } -// _tmp_182: lambda_slash_no_default | lambda_slash_with_default +// _tmp_183: lambda_slash_no_default | lambda_slash_with_default static void * -_tmp_182_rule(Parser *p) +_tmp_183_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35027,18 +35099,18 @@ _tmp_182_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_182[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); + D(fprintf(stderr, "%*c> _tmp_183[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); asdl_arg_seq* lambda_slash_no_default_var; if ( (lambda_slash_no_default_var = lambda_slash_no_default_rule(p)) // lambda_slash_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_182[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); + D(fprintf(stderr, "%*c+ _tmp_183[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_no_default")); _res = lambda_slash_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_182[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_183[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "lambda_slash_no_default")); } { // lambda_slash_with_default @@ -35046,18 +35118,18 @@ _tmp_182_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_182[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); + D(fprintf(stderr, "%*c> _tmp_183[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); SlashWithDefault* lambda_slash_with_default_var; if ( (lambda_slash_with_default_var = lambda_slash_with_default_rule(p)) // lambda_slash_with_default ) { - D(fprintf(stderr, "%*c+ _tmp_182[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); + D(fprintf(stderr, "%*c+ _tmp_183[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_slash_with_default")); _res = lambda_slash_with_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_182[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_183[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_slash_with_default")); } _res = NULL; @@ -35066,9 +35138,9 @@ _tmp_182_rule(Parser *p) return _res; } -// _loop0_183: lambda_param_maybe_default +// _loop0_184: lambda_param_maybe_default static asdl_seq * -_loop0_183_rule(Parser *p) +_loop0_184_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35094,7 +35166,7 @@ _loop0_183_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_183[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); NameDefaultPair* lambda_param_maybe_default_var; while ( (lambda_param_maybe_default_var = lambda_param_maybe_default_rule(p)) // lambda_param_maybe_default @@ -35117,7 +35189,7 @@ _loop0_183_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_183[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_184[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35134,9 +35206,9 @@ _loop0_183_rule(Parser *p) return _seq; } -// _tmp_184: ',' | lambda_param_no_default +// _tmp_185: ',' | lambda_param_no_default static void * -_tmp_184_rule(Parser *p) +_tmp_185_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35153,18 +35225,18 @@ _tmp_184_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_185[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_184[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_185[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_184[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_185[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "','")); } { // lambda_param_no_default @@ -35172,18 +35244,18 @@ _tmp_184_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_184[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c> _tmp_185[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); arg_ty lambda_param_no_default_var; if ( (lambda_param_no_default_var = lambda_param_no_default_rule(p)) // lambda_param_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_184[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c+ _tmp_185[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); _res = lambda_param_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_184[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_185[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_no_default")); } _res = NULL; @@ -35192,9 +35264,9 @@ _tmp_184_rule(Parser *p) return _res; } -// _loop0_185: lambda_param_maybe_default +// _loop0_186: lambda_param_maybe_default static asdl_seq * -_loop0_185_rule(Parser *p) +_loop0_186_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35220,7 +35292,7 @@ _loop0_185_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_185[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_186[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); NameDefaultPair* lambda_param_maybe_default_var; while ( (lambda_param_maybe_default_var = lambda_param_maybe_default_rule(p)) // lambda_param_maybe_default @@ -35243,7 +35315,7 @@ _loop0_185_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_185[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_186[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35260,9 +35332,9 @@ _loop0_185_rule(Parser *p) return _seq; } -// _loop1_186: lambda_param_maybe_default +// _loop1_187: lambda_param_maybe_default static asdl_seq * -_loop1_186_rule(Parser *p) +_loop1_187_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35288,7 +35360,7 @@ _loop1_186_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_186[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); + D(fprintf(stderr, "%*c> _loop1_187[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); NameDefaultPair* lambda_param_maybe_default_var; while ( (lambda_param_maybe_default_var = lambda_param_maybe_default_rule(p)) // lambda_param_maybe_default @@ -35311,7 +35383,7 @@ _loop1_186_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_186[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_187[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "lambda_param_maybe_default")); } if (_n == 0 || p->error_indicator) { @@ -35333,9 +35405,9 @@ _loop1_186_rule(Parser *p) return _seq; } -// _loop1_187: lambda_param_with_default +// _loop1_188: lambda_param_with_default static asdl_seq * -_loop1_187_rule(Parser *p) +_loop1_188_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35361,7 +35433,7 @@ _loop1_187_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_187[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); + D(fprintf(stderr, "%*c> _loop1_188[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_with_default")); NameDefaultPair* lambda_param_with_default_var; while ( (lambda_param_with_default_var = lambda_param_with_default_rule(p)) // lambda_param_with_default @@ -35384,7 +35456,7 @@ _loop1_187_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_187[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_188[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_with_default")); } if (_n == 0 || p->error_indicator) { @@ -35406,9 +35478,9 @@ _loop1_187_rule(Parser *p) return _seq; } -// _tmp_188: ':' | ',' (':' | '**') +// _tmp_189: ':' | ',' (':' | '**') static void * -_tmp_188_rule(Parser *p) +_tmp_189_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35425,18 +35497,18 @@ _tmp_188_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_188[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c> _tmp_189[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 11)) // token=':' ) { - D(fprintf(stderr, "%*c+ _tmp_188[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c+ _tmp_189[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_188[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_189[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':'")); } { // ',' (':' | '**') @@ -35444,21 +35516,21 @@ _tmp_188_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_188[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); + D(fprintf(stderr, "%*c> _tmp_189[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); Token * _literal; - void *_tmp_239_var; + void *_tmp_241_var; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (_tmp_239_var = _tmp_239_rule(p)) // ':' | '**' + (_tmp_241_var = _tmp_241_rule(p)) // ':' | '**' ) { - D(fprintf(stderr, "%*c+ _tmp_188[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); - _res = _PyPegen_dummy_name(p, _literal, _tmp_239_var); + D(fprintf(stderr, "%*c+ _tmp_189[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' (':' | '**')")); + _res = _PyPegen_dummy_name(p, _literal, _tmp_241_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_188[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_189[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' (':' | '**')")); } _res = NULL; @@ -35467,9 +35539,9 @@ _tmp_188_rule(Parser *p) return _res; } -// _tmp_189: lambda_param_no_default | ',' +// _tmp_190: lambda_param_no_default | ',' static void * -_tmp_189_rule(Parser *p) +_tmp_190_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35486,18 +35558,18 @@ _tmp_189_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_189[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c> _tmp_190[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); arg_ty lambda_param_no_default_var; if ( (lambda_param_no_default_var = lambda_param_no_default_rule(p)) // lambda_param_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_189[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c+ _tmp_190[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); _res = lambda_param_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_189[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_190[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_no_default")); } { // ',' @@ -35505,18 +35577,18 @@ _tmp_189_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_189[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_190[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_189[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_190[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_189[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_190[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -35525,9 +35597,9 @@ _tmp_189_rule(Parser *p) return _res; } -// _loop0_190: lambda_param_maybe_default +// _loop0_191: lambda_param_maybe_default static asdl_seq * -_loop0_190_rule(Parser *p) +_loop0_191_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35553,7 +35625,7 @@ _loop0_190_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_190[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); + D(fprintf(stderr, "%*c> _loop0_191[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_maybe_default")); NameDefaultPair* lambda_param_maybe_default_var; while ( (lambda_param_maybe_default_var = lambda_param_maybe_default_rule(p)) // lambda_param_maybe_default @@ -35576,7 +35648,7 @@ _loop0_190_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_190[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_191[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "lambda_param_maybe_default")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35593,9 +35665,9 @@ _loop0_190_rule(Parser *p) return _seq; } -// _tmp_191: lambda_param_no_default | ',' +// _tmp_192: lambda_param_no_default | ',' static void * -_tmp_191_rule(Parser *p) +_tmp_192_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35612,18 +35684,18 @@ _tmp_191_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_191[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); arg_ty lambda_param_no_default_var; if ( (lambda_param_no_default_var = lambda_param_no_default_rule(p)) // lambda_param_no_default ) { - D(fprintf(stderr, "%*c+ _tmp_191[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); + D(fprintf(stderr, "%*c+ _tmp_192[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "lambda_param_no_default")); _res = lambda_param_no_default_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_191[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_192[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "lambda_param_no_default")); } { // ',' @@ -35631,18 +35703,18 @@ _tmp_191_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_191[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_191[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_192[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_191[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_192[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -35651,9 +35723,9 @@ _tmp_191_rule(Parser *p) return _res; } -// _tmp_192: '*' | '**' | '/' +// _tmp_193: '*' | '**' | '/' static void * -_tmp_192_rule(Parser *p) +_tmp_193_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35670,18 +35742,18 @@ _tmp_192_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*'")); + D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'*'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 16)) // token='*' ) { - D(fprintf(stderr, "%*c+ _tmp_192[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*'")); + D(fprintf(stderr, "%*c+ _tmp_193[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'*'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_192[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_193[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'*'")); } { // '**' @@ -35689,18 +35761,18 @@ _tmp_192_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 35)) // token='**' ) { - D(fprintf(stderr, "%*c+ _tmp_192[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c+ _tmp_193[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_192[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_193[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'**'")); } { // '/' @@ -35708,18 +35780,18 @@ _tmp_192_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_192[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'/'")); + D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'/'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 17)) // token='/' ) { - D(fprintf(stderr, "%*c+ _tmp_192[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'/'")); + D(fprintf(stderr, "%*c+ _tmp_193[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'/'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_192[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_193[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'/'")); } _res = NULL; @@ -35728,9 +35800,9 @@ _tmp_192_rule(Parser *p) return _res; } -// _tmp_193: ',' | ')' | ':' +// _tmp_194: ',' | ')' | ':' static void * -_tmp_193_rule(Parser *p) +_tmp_194_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35747,18 +35819,18 @@ _tmp_193_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_193[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_194[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_193[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_194[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } { // ')' @@ -35766,18 +35838,18 @@ _tmp_193_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_193[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c+ _tmp_194[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_193[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_194[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "')'")); } { // ':' @@ -35785,18 +35857,18 @@ _tmp_193_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_193[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c> _tmp_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 11)) // token=':' ) { - D(fprintf(stderr, "%*c+ _tmp_193[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c+ _tmp_194[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_193[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_194[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':'")); } _res = NULL; @@ -35805,9 +35877,9 @@ _tmp_193_rule(Parser *p) return _res; } -// _loop0_195: ',' (expression ['as' star_target]) +// _loop0_196: ',' (expression ['as' star_target]) static asdl_seq * -_loop0_195_rule(Parser *p) +_loop0_196_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35833,13 +35905,13 @@ _loop0_195_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_195[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); + D(fprintf(stderr, "%*c> _loop0_196[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); Token * _literal; void *elem; while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_240_rule(p)) // expression ['as' star_target] + (elem = _tmp_242_rule(p)) // expression ['as' star_target] ) { _res = elem; @@ -35865,7 +35937,7 @@ _loop0_195_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_195[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_196[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' (expression ['as' star_target])")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -35882,9 +35954,9 @@ _loop0_195_rule(Parser *p) return _seq; } -// _gather_194: (expression ['as' star_target]) _loop0_195 +// _gather_195: (expression ['as' star_target]) _loop0_196 static asdl_seq * -_gather_194_rule(Parser *p) +_gather_195_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35896,27 +35968,27 @@ _gather_194_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // (expression ['as' star_target]) _loop0_195 + { // (expression ['as' star_target]) _loop0_196 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_194[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_195")); + D(fprintf(stderr, "%*c> _gather_195[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_196")); void *elem; asdl_seq * seq; if ( - (elem = _tmp_240_rule(p)) // expression ['as' star_target] + (elem = _tmp_242_rule(p)) // expression ['as' star_target] && - (seq = _loop0_195_rule(p)) // _loop0_195 + (seq = _loop0_196_rule(p)) // _loop0_196 ) { - D(fprintf(stderr, "%*c+ _gather_194[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_195")); + D(fprintf(stderr, "%*c+ _gather_195[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_196")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_194[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expression ['as' star_target]) _loop0_195")); + D(fprintf(stderr, "%*c%s _gather_195[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expression ['as' star_target]) _loop0_196")); } _res = NULL; done: @@ -35924,9 +35996,9 @@ _gather_194_rule(Parser *p) return _res; } -// _loop0_197: ',' (expressions ['as' star_target]) +// _loop0_198: ',' (expressions ['as' star_target]) static asdl_seq * -_loop0_197_rule(Parser *p) +_loop0_198_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -35952,13 +36024,13 @@ _loop0_197_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_197[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); + D(fprintf(stderr, "%*c> _loop0_198[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); Token * _literal; void *elem; while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_241_rule(p)) // expressions ['as' star_target] + (elem = _tmp_243_rule(p)) // expressions ['as' star_target] ) { _res = elem; @@ -35984,7 +36056,7 @@ _loop0_197_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_197[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_198[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' (expressions ['as' star_target])")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -36001,9 +36073,9 @@ _loop0_197_rule(Parser *p) return _seq; } -// _gather_196: (expressions ['as' star_target]) _loop0_197 +// _gather_197: (expressions ['as' star_target]) _loop0_198 static asdl_seq * -_gather_196_rule(Parser *p) +_gather_197_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36015,27 +36087,27 @@ _gather_196_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // (expressions ['as' star_target]) _loop0_197 + { // (expressions ['as' star_target]) _loop0_198 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_196[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_197")); + D(fprintf(stderr, "%*c> _gather_197[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_198")); void *elem; asdl_seq * seq; if ( - (elem = _tmp_241_rule(p)) // expressions ['as' star_target] + (elem = _tmp_243_rule(p)) // expressions ['as' star_target] && - (seq = _loop0_197_rule(p)) // _loop0_197 + (seq = _loop0_198_rule(p)) // _loop0_198 ) { - D(fprintf(stderr, "%*c+ _gather_196[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_197")); + D(fprintf(stderr, "%*c+ _gather_197[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_198")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_196[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expressions ['as' star_target]) _loop0_197")); + D(fprintf(stderr, "%*c%s _gather_197[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expressions ['as' star_target]) _loop0_198")); } _res = NULL; done: @@ -36043,9 +36115,9 @@ _gather_196_rule(Parser *p) return _res; } -// _loop0_199: ',' (expression ['as' star_target]) +// _loop0_200: ',' (expression ['as' star_target]) static asdl_seq * -_loop0_199_rule(Parser *p) +_loop0_200_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36071,13 +36143,13 @@ _loop0_199_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_199[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); + D(fprintf(stderr, "%*c> _loop0_200[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expression ['as' star_target])")); Token * _literal; void *elem; while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_242_rule(p)) // expression ['as' star_target] + (elem = _tmp_244_rule(p)) // expression ['as' star_target] ) { _res = elem; @@ -36103,7 +36175,7 @@ _loop0_199_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_199[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_200[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' (expression ['as' star_target])")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -36120,9 +36192,9 @@ _loop0_199_rule(Parser *p) return _seq; } -// _gather_198: (expression ['as' star_target]) _loop0_199 +// _gather_199: (expression ['as' star_target]) _loop0_200 static asdl_seq * -_gather_198_rule(Parser *p) +_gather_199_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36134,27 +36206,27 @@ _gather_198_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // (expression ['as' star_target]) _loop0_199 + { // (expression ['as' star_target]) _loop0_200 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_198[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_199")); + D(fprintf(stderr, "%*c> _gather_199[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_200")); void *elem; asdl_seq * seq; if ( - (elem = _tmp_242_rule(p)) // expression ['as' star_target] + (elem = _tmp_244_rule(p)) // expression ['as' star_target] && - (seq = _loop0_199_rule(p)) // _loop0_199 + (seq = _loop0_200_rule(p)) // _loop0_200 ) { - D(fprintf(stderr, "%*c+ _gather_198[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_199")); + D(fprintf(stderr, "%*c+ _gather_199[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expression ['as' star_target]) _loop0_200")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_198[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expression ['as' star_target]) _loop0_199")); + D(fprintf(stderr, "%*c%s _gather_199[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expression ['as' star_target]) _loop0_200")); } _res = NULL; done: @@ -36162,9 +36234,9 @@ _gather_198_rule(Parser *p) return _res; } -// _loop0_201: ',' (expressions ['as' star_target]) +// _loop0_202: ',' (expressions ['as' star_target]) static asdl_seq * -_loop0_201_rule(Parser *p) +_loop0_202_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36190,13 +36262,13 @@ _loop0_201_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_201[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); + D(fprintf(stderr, "%*c> _loop0_202[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (expressions ['as' star_target])")); Token * _literal; void *elem; while ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' && - (elem = _tmp_243_rule(p)) // expressions ['as' star_target] + (elem = _tmp_245_rule(p)) // expressions ['as' star_target] ) { _res = elem; @@ -36222,7 +36294,7 @@ _loop0_201_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_201[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_202[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' (expressions ['as' star_target])")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -36239,9 +36311,9 @@ _loop0_201_rule(Parser *p) return _seq; } -// _gather_200: (expressions ['as' star_target]) _loop0_201 +// _gather_201: (expressions ['as' star_target]) _loop0_202 static asdl_seq * -_gather_200_rule(Parser *p) +_gather_201_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36253,27 +36325,27 @@ _gather_200_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // (expressions ['as' star_target]) _loop0_201 + { // (expressions ['as' star_target]) _loop0_202 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_200[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_201")); + D(fprintf(stderr, "%*c> _gather_201[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_202")); void *elem; asdl_seq * seq; if ( - (elem = _tmp_243_rule(p)) // expressions ['as' star_target] + (elem = _tmp_245_rule(p)) // expressions ['as' star_target] && - (seq = _loop0_201_rule(p)) // _loop0_201 + (seq = _loop0_202_rule(p)) // _loop0_202 ) { - D(fprintf(stderr, "%*c+ _gather_200[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_201")); + D(fprintf(stderr, "%*c+ _gather_201[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(expressions ['as' star_target]) _loop0_202")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_200[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expressions ['as' star_target]) _loop0_201")); + D(fprintf(stderr, "%*c%s _gather_201[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(expressions ['as' star_target]) _loop0_202")); } _res = NULL; done: @@ -36281,9 +36353,9 @@ _gather_200_rule(Parser *p) return _res; } -// _tmp_202: 'except' | 'finally' +// _tmp_203: 'except' | 'finally' static void * -_tmp_202_rule(Parser *p) +_tmp_203_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36300,18 +36372,18 @@ _tmp_202_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_202[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except'")); + D(fprintf(stderr, "%*c> _tmp_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'except'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 634)) // token='except' ) { - D(fprintf(stderr, "%*c+ _tmp_202[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except'")); + D(fprintf(stderr, "%*c+ _tmp_203[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'except'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_202[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_203[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'except'")); } { // 'finally' @@ -36319,18 +36391,18 @@ _tmp_202_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_202[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally'")); + D(fprintf(stderr, "%*c> _tmp_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'finally'")); Token * _keyword; if ( (_keyword = _PyPegen_expect_token(p, 630)) // token='finally' ) { - D(fprintf(stderr, "%*c+ _tmp_202[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'finally'")); + D(fprintf(stderr, "%*c+ _tmp_203[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'finally'")); _res = _keyword; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_202[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_203[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'finally'")); } _res = NULL; @@ -36339,9 +36411,9 @@ _tmp_202_rule(Parser *p) return _res; } -// _loop0_203: block +// _loop0_204: block static asdl_seq * -_loop0_203_rule(Parser *p) +_loop0_204_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36367,7 +36439,7 @@ _loop0_203_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_203[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); + D(fprintf(stderr, "%*c> _loop0_204[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); asdl_stmt_seq* block_var; while ( (block_var = block_rule(p)) // block @@ -36390,7 +36462,7 @@ _loop0_203_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_203[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_204[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "block")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -36407,9 +36479,9 @@ _loop0_203_rule(Parser *p) return _seq; } -// _loop1_204: except_block +// _loop1_205: except_block static asdl_seq * -_loop1_204_rule(Parser *p) +_loop1_205_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36435,7 +36507,7 @@ _loop1_204_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_204[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); + D(fprintf(stderr, "%*c> _loop1_205[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_block")); excepthandler_ty except_block_var; while ( (except_block_var = except_block_rule(p)) // except_block @@ -36458,7 +36530,7 @@ _loop1_204_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_204[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_205[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "except_block")); } if (_n == 0 || p->error_indicator) { @@ -36480,9 +36552,9 @@ _loop1_204_rule(Parser *p) return _seq; } -// _tmp_205: 'as' NAME +// _tmp_206: 'as' NAME static void * -_tmp_205_rule(Parser *p) +_tmp_206_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36499,7 +36571,7 @@ _tmp_205_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_205[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c> _tmp_206[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); Token * _keyword; expr_ty name_var; if ( @@ -36508,12 +36580,12 @@ _tmp_205_rule(Parser *p) (name_var = _PyPegen_name_token(p)) // NAME ) { - D(fprintf(stderr, "%*c+ _tmp_205[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c+ _tmp_206[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); _res = _PyPegen_dummy_name(p, _keyword, name_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_205[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_206[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' NAME")); } _res = NULL; @@ -36522,9 +36594,9 @@ _tmp_205_rule(Parser *p) return _res; } -// _loop0_206: block +// _loop0_207: block static asdl_seq * -_loop0_206_rule(Parser *p) +_loop0_207_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36550,7 +36622,7 @@ _loop0_206_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_206[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); + D(fprintf(stderr, "%*c> _loop0_207[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "block")); asdl_stmt_seq* block_var; while ( (block_var = block_rule(p)) // block @@ -36573,7 +36645,7 @@ _loop0_206_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_206[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_207[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "block")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -36590,9 +36662,9 @@ _loop0_206_rule(Parser *p) return _seq; } -// _loop1_207: except_star_block +// _loop1_208: except_star_block static asdl_seq * -_loop1_207_rule(Parser *p) +_loop1_208_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36618,7 +36690,7 @@ _loop1_207_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop1_207[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); + D(fprintf(stderr, "%*c> _loop1_208[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "except_star_block")); excepthandler_ty except_star_block_var; while ( (except_star_block_var = except_star_block_rule(p)) // except_star_block @@ -36641,7 +36713,7 @@ _loop1_207_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop1_207[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop1_208[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "except_star_block")); } if (_n == 0 || p->error_indicator) { @@ -36663,9 +36735,9 @@ _loop1_207_rule(Parser *p) return _seq; } -// _tmp_208: expression ['as' NAME] +// _tmp_209: expression ['as' NAME] static void * -_tmp_208_rule(Parser *p) +_tmp_209_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36682,22 +36754,22 @@ _tmp_208_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_208[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' NAME]")); + D(fprintf(stderr, "%*c> _tmp_209[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' NAME]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_244_rule(p), !p->error_indicator) // ['as' NAME] + (_opt_var = _tmp_246_rule(p), !p->error_indicator) // ['as' NAME] ) { - D(fprintf(stderr, "%*c+ _tmp_208[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' NAME]")); + D(fprintf(stderr, "%*c+ _tmp_209[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' NAME]")); _res = _PyPegen_dummy_name(p, expression_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_208[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_209[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "expression ['as' NAME]")); } _res = NULL; @@ -36706,9 +36778,9 @@ _tmp_208_rule(Parser *p) return _res; } -// _tmp_209: 'as' NAME +// _tmp_210: 'as' NAME static void * -_tmp_209_rule(Parser *p) +_tmp_210_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36725,7 +36797,7 @@ _tmp_209_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_209[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c> _tmp_210[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); Token * _keyword; expr_ty name_var; if ( @@ -36734,12 +36806,12 @@ _tmp_209_rule(Parser *p) (name_var = _PyPegen_name_token(p)) // NAME ) { - D(fprintf(stderr, "%*c+ _tmp_209[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c+ _tmp_210[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); _res = _PyPegen_dummy_name(p, _keyword, name_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_209[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_210[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'as' NAME")); } _res = NULL; @@ -36748,9 +36820,9 @@ _tmp_209_rule(Parser *p) return _res; } -// _tmp_210: 'as' NAME +// _tmp_211: 'as' NAME static void * -_tmp_210_rule(Parser *p) +_tmp_211_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36767,7 +36839,7 @@ _tmp_210_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_210[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c> _tmp_211[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); Token * _keyword; expr_ty name_var; if ( @@ -36776,12 +36848,12 @@ _tmp_210_rule(Parser *p) (name_var = _PyPegen_name_token(p)) // NAME ) { - D(fprintf(stderr, "%*c+ _tmp_210[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c+ _tmp_211[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); _res = _PyPegen_dummy_name(p, _keyword, name_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_210[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_211[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' NAME")); } _res = NULL; @@ -36790,9 +36862,9 @@ _tmp_210_rule(Parser *p) return _res; } -// _tmp_211: NEWLINE | ':' +// _tmp_212: NEWLINE | ':' static void * -_tmp_211_rule(Parser *p) +_tmp_212_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36809,18 +36881,18 @@ _tmp_211_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_211[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); + D(fprintf(stderr, "%*c> _tmp_212[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "NEWLINE")); Token * newline_var; if ( (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ _tmp_211[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NEWLINE")); + D(fprintf(stderr, "%*c+ _tmp_212[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "NEWLINE")); _res = newline_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_211[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_212[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "NEWLINE")); } { // ':' @@ -36828,18 +36900,18 @@ _tmp_211_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_211[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c> _tmp_212[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 11)) // token=':' ) { - D(fprintf(stderr, "%*c+ _tmp_211[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c+ _tmp_212[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_211[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_212[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "':'")); } _res = NULL; @@ -36848,9 +36920,9 @@ _tmp_211_rule(Parser *p) return _res; } -// _tmp_212: 'as' NAME +// _tmp_213: 'as' NAME static void * -_tmp_212_rule(Parser *p) +_tmp_213_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36867,7 +36939,7 @@ _tmp_212_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_212[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c> _tmp_213[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); Token * _keyword; expr_ty name_var; if ( @@ -36876,12 +36948,12 @@ _tmp_212_rule(Parser *p) (name_var = _PyPegen_name_token(p)) // NAME ) { - D(fprintf(stderr, "%*c+ _tmp_212[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c+ _tmp_213[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); _res = _PyPegen_dummy_name(p, _keyword, name_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_212[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_213[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' NAME")); } _res = NULL; @@ -36890,9 +36962,9 @@ _tmp_212_rule(Parser *p) return _res; } -// _tmp_213: 'as' NAME +// _tmp_214: 'as' NAME static void * -_tmp_213_rule(Parser *p) +_tmp_214_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36909,7 +36981,7 @@ _tmp_213_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_213[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c> _tmp_214[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); Token * _keyword; expr_ty name_var; if ( @@ -36918,12 +36990,12 @@ _tmp_213_rule(Parser *p) (name_var = _PyPegen_name_token(p)) // NAME ) { - D(fprintf(stderr, "%*c+ _tmp_213[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c+ _tmp_214[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); _res = _PyPegen_dummy_name(p, _keyword, name_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_213[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_214[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'as' NAME")); } _res = NULL; @@ -36932,9 +37004,9 @@ _tmp_213_rule(Parser *p) return _res; } -// _tmp_214: positional_patterns ',' +// _tmp_215: positional_patterns ',' static void * -_tmp_214_rule(Parser *p) +_tmp_215_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36951,7 +37023,7 @@ _tmp_214_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_214[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "positional_patterns ','")); + D(fprintf(stderr, "%*c> _tmp_215[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "positional_patterns ','")); Token * _literal; asdl_pattern_seq* positional_patterns_var; if ( @@ -36960,12 +37032,12 @@ _tmp_214_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_214[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "positional_patterns ','")); + D(fprintf(stderr, "%*c+ _tmp_215[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "positional_patterns ','")); _res = _PyPegen_dummy_name(p, positional_patterns_var, _literal); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_214[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_215[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "positional_patterns ','")); } _res = NULL; @@ -36974,9 +37046,9 @@ _tmp_214_rule(Parser *p) return _res; } -// _tmp_215: '->' expression +// _tmp_216: '->' expression static void * -_tmp_215_rule(Parser *p) +_tmp_216_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -36993,7 +37065,7 @@ _tmp_215_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_215[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); + D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'->' expression")); Token * _literal; expr_ty expression_var; if ( @@ -37002,12 +37074,12 @@ _tmp_215_rule(Parser *p) (expression_var = expression_rule(p)) // expression ) { - D(fprintf(stderr, "%*c+ _tmp_215[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'->' expression")); + D(fprintf(stderr, "%*c+ _tmp_216[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'->' expression")); _res = _PyPegen_dummy_name(p, _literal, expression_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_215[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_216[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'->' expression")); } _res = NULL; @@ -37016,9 +37088,9 @@ _tmp_215_rule(Parser *p) return _res; } -// _tmp_216: '(' arguments? ')' +// _tmp_217: '(' arguments? ')' static void * -_tmp_216_rule(Parser *p) +_tmp_217_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37035,7 +37107,7 @@ _tmp_216_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_216[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); + D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); Token * _literal; Token * _literal_1; void *_opt_var; @@ -37048,12 +37120,12 @@ _tmp_216_rule(Parser *p) (_literal_1 = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_216[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); + D(fprintf(stderr, "%*c+ _tmp_217[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' arguments? 
')'")); _res = _PyPegen_dummy_name(p, _literal, _opt_var, _literal_1); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_216[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_217[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'(' arguments? ')'")); } _res = NULL; @@ -37062,9 +37134,9 @@ _tmp_216_rule(Parser *p) return _res; } -// _tmp_217: '(' arguments? ')' +// _tmp_218: '(' arguments? ')' static void * -_tmp_217_rule(Parser *p) +_tmp_218_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37081,7 +37153,7 @@ _tmp_217_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_217[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); + D(fprintf(stderr, "%*c> _tmp_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); Token * _literal; Token * _literal_1; void *_opt_var; @@ -37094,12 +37166,12 @@ _tmp_217_rule(Parser *p) (_literal_1 = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_217[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); + D(fprintf(stderr, "%*c+ _tmp_218[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'(' arguments? ')'")); _res = _PyPegen_dummy_name(p, _literal, _opt_var, _literal_1); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_217[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_218[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'(' arguments? ')'")); } _res = NULL; @@ -37108,9 +37180,9 @@ _tmp_217_rule(Parser *p) return _res; } -// _loop0_219: ',' double_starred_kvpair +// _loop0_220: ',' double_starred_kvpair static asdl_seq * -_loop0_219_rule(Parser *p) +_loop0_220_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37136,7 +37208,7 @@ _loop0_219_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _loop0_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); + D(fprintf(stderr, "%*c> _loop0_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' double_starred_kvpair")); Token * _literal; KeyValuePair* elem; while ( @@ -37168,7 +37240,7 @@ _loop0_219_rule(Parser *p) _mark = p->mark; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _loop0_219[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _loop0_220[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' double_starred_kvpair")); } asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); @@ -37185,9 +37257,9 @@ _loop0_219_rule(Parser *p) return _seq; } -// _gather_218: double_starred_kvpair _loop0_219 +// _gather_219: double_starred_kvpair _loop0_220 static asdl_seq * -_gather_218_rule(Parser *p) +_gather_219_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37199,27 +37271,27 @@ _gather_218_rule(Parser *p) } asdl_seq * _res = NULL; int _mark = p->mark; - { // double_starred_kvpair _loop0_219 + { // double_starred_kvpair _loop0_220 if (p->error_indicator) { p->level--; return NULL; } - D(fprintf(stderr, "%*c> _gather_218[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_219")); + D(fprintf(stderr, "%*c> _gather_219[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_220")); KeyValuePair* elem; asdl_seq * seq; if ( (elem = double_starred_kvpair_rule(p)) // double_starred_kvpair && - (seq = _loop0_219_rule(p)) // _loop0_219 + (seq = _loop0_220_rule(p)) // _loop0_220 ) { - D(fprintf(stderr, "%*c+ _gather_218[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_219")); + D(fprintf(stderr, "%*c+ _gather_219[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "double_starred_kvpair _loop0_220")); _res = _PyPegen_seq_insert_in_front(p, elem, seq); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _gather_218[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "double_starred_kvpair _loop0_219")); + D(fprintf(stderr, "%*c%s _gather_219[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "double_starred_kvpair _loop0_220")); } _res = NULL; done: @@ -37227,9 +37299,9 @@ _gather_218_rule(Parser *p) return _res; } -// _tmp_220: '}' | ',' +// _tmp_221: '}' | ',' static void * -_tmp_220_rule(Parser *p) +_tmp_221_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37246,18 +37318,18 @@ _tmp_220_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); + D(fprintf(stderr, "%*c> _tmp_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 26)) // token='}' ) { - D(fprintf(stderr, "%*c+ _tmp_220[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'}'")); + D(fprintf(stderr, "%*c+ _tmp_221[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'}'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_220[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_221[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'}'")); } { // ',' @@ -37265,18 +37337,18 @@ _tmp_220_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_220[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_220[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_221[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_220[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_221[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','")); } _res = NULL; @@ -37285,9 +37357,9 @@ _tmp_220_rule(Parser *p) return _res; } -// _tmp_221: '}' | ',' +// _tmp_222: '}' | ',' static void * -_tmp_221_rule(Parser *p) +_tmp_222_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37304,18 +37376,18 @@ _tmp_221_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); + D(fprintf(stderr, "%*c> _tmp_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'}'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 26)) // token='}' ) { - D(fprintf(stderr, "%*c+ _tmp_221[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'}'")); + D(fprintf(stderr, "%*c+ _tmp_222[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'}'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_221[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_222[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'}'")); } { // ',' @@ -37323,18 +37395,18 @@ _tmp_221_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_221[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c> _tmp_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 12)) // token=',' ) { - D(fprintf(stderr, "%*c+ _tmp_221[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); + D(fprintf(stderr, "%*c+ _tmp_222[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_221[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_222[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "','")); } _res = NULL; @@ -37343,9 +37415,9 @@ _tmp_221_rule(Parser *p) return _res; } -// _tmp_222: star_targets '=' +// _tmp_223: star_targets '=' static void * -_tmp_222_rule(Parser *p) +_tmp_223_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37362,7 +37434,7 @@ _tmp_222_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_222[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c> _tmp_223[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); Token * _literal; expr_ty z; if ( @@ -37371,7 +37443,7 @@ _tmp_222_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_222[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c+ _tmp_223[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37381,7 +37453,7 @@ _tmp_222_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_222[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_223[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "star_targets '='")); } _res = NULL; @@ -37390,9 +37462,9 @@ _tmp_222_rule(Parser *p) return _res; } -// _tmp_223: '.' | '...' +// _tmp_224: '.' | '...' static void * -_tmp_223_rule(Parser *p) +_tmp_224_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37409,18 +37481,18 @@ _tmp_223_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_223[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c> _tmp_224[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 23)) // token='.' ) { - D(fprintf(stderr, "%*c+ _tmp_223[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c+ _tmp_224[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_223[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_224[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'.'")); } { // '...' @@ -37428,18 +37500,18 @@ _tmp_223_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_223[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c> _tmp_224[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 52)) // token='...' ) { - D(fprintf(stderr, "%*c+ _tmp_223[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c+ _tmp_224[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_223[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_224[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'...'")); } _res = NULL; @@ -37448,9 +37520,9 @@ _tmp_223_rule(Parser *p) return _res; } -// _tmp_224: '.' | '...' +// _tmp_225: '.' | '...' 
static void * -_tmp_224_rule(Parser *p) +_tmp_225_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37467,18 +37539,18 @@ _tmp_224_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_224[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c> _tmp_225[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'.'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 23)) // token='.' ) { - D(fprintf(stderr, "%*c+ _tmp_224[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); + D(fprintf(stderr, "%*c+ _tmp_225[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'.'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_224[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_225[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'.'")); } { // '...' @@ -37486,18 +37558,18 @@ _tmp_224_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_224[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c> _tmp_225[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'...'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 52)) // token='...' ) { - D(fprintf(stderr, "%*c+ _tmp_224[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); + D(fprintf(stderr, "%*c+ _tmp_225[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'...'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_224[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_225[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'...'")); } _res = NULL; @@ -37506,9 +37578,9 @@ _tmp_224_rule(Parser *p) return _res; } -// _tmp_225: '@' named_expression NEWLINE +// _tmp_226: '@' named_expression NEWLINE static void * -_tmp_225_rule(Parser *p) +_tmp_226_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37525,7 +37597,7 @@ _tmp_225_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_225[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); + D(fprintf(stderr, "%*c> _tmp_226[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); Token * _literal; expr_ty f; Token * newline_var; @@ -37537,7 +37609,7 @@ _tmp_225_rule(Parser *p) (newline_var = _PyPegen_expect_token(p, NEWLINE)) // token='NEWLINE' ) { - D(fprintf(stderr, "%*c+ _tmp_225[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); + D(fprintf(stderr, "%*c+ _tmp_226[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'@' named_expression NEWLINE")); _res = f; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37547,7 +37619,7 @@ _tmp_225_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_225[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_226[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'@' named_expression NEWLINE")); } _res = NULL; @@ -37556,9 +37628,9 @@ _tmp_225_rule(Parser *p) return _res; } -// _tmp_226: ',' expression +// _tmp_227: ',' expression static void * -_tmp_226_rule(Parser *p) +_tmp_227_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37575,7 +37647,7 @@ _tmp_226_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_226[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); + D(fprintf(stderr, "%*c> _tmp_227[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' expression")); Token * _literal; expr_ty c; if ( @@ -37584,7 +37656,7 @@ _tmp_226_rule(Parser *p) (c = expression_rule(p)) // expression ) { - D(fprintf(stderr, "%*c+ _tmp_226[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' expression")); + D(fprintf(stderr, "%*c+ _tmp_227[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' expression")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37594,7 +37666,7 @@ _tmp_226_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_226[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_227[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "',' expression")); } _res = NULL; @@ -37603,9 +37675,9 @@ _tmp_226_rule(Parser *p) return _res; } -// _tmp_227: ',' star_expression +// _tmp_228: ',' star_expression static void * -_tmp_227_rule(Parser *p) +_tmp_228_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37622,7 +37694,7 @@ _tmp_227_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_227[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_expression")); + D(fprintf(stderr, "%*c> _tmp_228[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_expression")); Token * _literal; expr_ty c; if ( @@ -37631,7 +37703,7 @@ _tmp_227_rule(Parser *p) (c = star_expression_rule(p)) // star_expression ) { - D(fprintf(stderr, "%*c+ _tmp_227[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_expression")); + D(fprintf(stderr, "%*c+ _tmp_228[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_expression")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37641,7 +37713,7 @@ _tmp_227_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_227[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_228[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' star_expression")); } _res = NULL; @@ -37650,9 +37722,9 @@ _tmp_227_rule(Parser *p) return _res; } -// _tmp_228: 'or' conjunction +// _tmp_229: 'or' conjunction static void * -_tmp_228_rule(Parser *p) +_tmp_229_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37669,7 +37741,7 @@ _tmp_228_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_228[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); + D(fprintf(stderr, "%*c> _tmp_229[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); Token * _keyword; expr_ty c; if ( @@ -37678,7 +37750,7 @@ _tmp_228_rule(Parser *p) (c = conjunction_rule(p)) // conjunction ) { - D(fprintf(stderr, "%*c+ _tmp_228[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); + D(fprintf(stderr, "%*c+ _tmp_229[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'or' conjunction")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37688,7 +37760,7 @@ _tmp_228_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_228[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_229[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'or' conjunction")); } _res = NULL; @@ -37697,9 +37769,9 @@ _tmp_228_rule(Parser *p) return _res; } -// _tmp_229: 'and' inversion +// _tmp_230: 'and' inversion static void * -_tmp_229_rule(Parser *p) +_tmp_230_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37716,7 +37788,7 @@ _tmp_229_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_229[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'and' inversion")); + D(fprintf(stderr, "%*c> _tmp_230[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'and' inversion")); Token * _keyword; expr_ty c; if ( @@ -37725,7 +37797,7 @@ _tmp_229_rule(Parser *p) (c = inversion_rule(p)) // inversion ) { - D(fprintf(stderr, "%*c+ _tmp_229[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'and' inversion")); + D(fprintf(stderr, "%*c+ _tmp_230[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'and' inversion")); _res = c; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37735,7 +37807,7 @@ _tmp_229_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_229[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_230[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'and' inversion")); } _res = NULL; @@ -37744,9 +37816,9 @@ _tmp_229_rule(Parser *p) return _res; } -// _tmp_230: slice | starred_expression +// _tmp_231: slice | starred_expression static void * -_tmp_230_rule(Parser *p) +_tmp_231_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37763,18 +37835,18 @@ _tmp_230_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_230[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice")); + D(fprintf(stderr, "%*c> _tmp_231[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "slice")); expr_ty slice_var; if ( (slice_var = slice_rule(p)) // slice ) { - D(fprintf(stderr, "%*c+ _tmp_230[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slice")); + D(fprintf(stderr, "%*c+ _tmp_231[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "slice")); _res = slice_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_230[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_231[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "slice")); } { // starred_expression @@ -37782,18 +37854,18 @@ _tmp_230_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_230[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c> _tmp_231[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); expr_ty starred_expression_var; if ( (starred_expression_var = starred_expression_rule(p)) // starred_expression ) { - D(fprintf(stderr, "%*c+ _tmp_230[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c+ _tmp_231[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); _res = starred_expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_230[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_231[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "starred_expression")); } _res = NULL; @@ -37802,9 +37874,9 @@ _tmp_230_rule(Parser *p) return _res; } -// _tmp_231: 'if' disjunction +// _tmp_232: 'if' disjunction static void * -_tmp_231_rule(Parser *p) +_tmp_232_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37821,7 +37893,7 @@ _tmp_231_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_231[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c> _tmp_232[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); Token * _keyword; expr_ty z; if ( @@ -37830,7 +37902,7 @@ _tmp_231_rule(Parser *p) (z = disjunction_rule(p)) // disjunction ) { - D(fprintf(stderr, "%*c+ _tmp_231[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c+ _tmp_232[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37840,7 +37912,7 @@ _tmp_231_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_231[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_232[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'if' disjunction")); } _res = NULL; @@ -37849,9 +37921,9 @@ _tmp_231_rule(Parser *p) return _res; } -// _tmp_232: 'if' disjunction +// _tmp_233: 'if' disjunction static void * -_tmp_232_rule(Parser *p) +_tmp_233_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37868,7 +37940,7 @@ _tmp_232_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_232[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c> _tmp_233[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); Token * _keyword; expr_ty z; if ( @@ -37877,7 +37949,7 @@ _tmp_232_rule(Parser *p) (z = disjunction_rule(p)) // disjunction ) { - D(fprintf(stderr, "%*c+ _tmp_232[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); + D(fprintf(stderr, "%*c+ _tmp_233[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'if' disjunction")); _res = z; if (_res == NULL && PyErr_Occurred()) { p->error_indicator = 1; @@ -37887,7 +37959,7 @@ _tmp_232_rule(Parser *p) goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_232[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_233[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'if' disjunction")); } _res = NULL; @@ -37896,9 +37968,9 @@ _tmp_232_rule(Parser *p) return _res; } -// _tmp_233: starred_expression | (assignment_expression | expression !':=') !'=' +// _tmp_234: starred_expression | (assignment_expression | expression !':=') !'=' static void * -_tmp_233_rule(Parser *p) +_tmp_234_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -37915,18 +37987,18 @@ _tmp_233_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_233[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); expr_ty starred_expression_var; if ( (starred_expression_var = starred_expression_rule(p)) // starred_expression ) { - D(fprintf(stderr, "%*c+ _tmp_233[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); + D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); _res = starred_expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_233[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_234[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "starred_expression")); } { // (assignment_expression | expression !':=') !'=' @@ -37934,68 +38006,21 @@ _tmp_233_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_233[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); - void *_tmp_245_var; + D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + void *_tmp_247_var; if ( - (_tmp_245_var = _tmp_245_rule(p)) // assignment_expression | expression !':=' + (_tmp_247_var = _tmp_247_rule(p)) // assignment_expression | expression !':=' && _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 22) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_233[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); - _res = _tmp_245_var; - goto done; - } - p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_233[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(assignment_expression | expression !':=') !'='")); - } - _res = NULL; - done: - p->level--; - return _res; -} - -// _tmp_234: ',' star_target -static void * -_tmp_234_rule(Parser *p) -{ - if (p->level++ == MAXSTACK) { - p->error_indicator = 1; - PyErr_NoMemory(); - } - if (p->error_indicator) { - p->level--; - return NULL; - } - void * _res = NULL; - int _mark = p->mark; - { // ',' star_target - if (p->error_indicator) { - p->level--; - return NULL; - } - D(fprintf(stderr, "%*c> _tmp_234[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); - Token * _literal; - expr_ty c; - if ( - (_literal = _PyPegen_expect_token(p, 12)) // token=',' - && - (c = star_target_rule(p)) // star_target - ) - { - D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_target")); - _res = c; - if (_res == NULL && PyErr_Occurred()) { - p->error_indicator = 1; - p->level--; - return NULL; - } + D(fprintf(stderr, "%*c+ _tmp_234[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + _res = _tmp_247_var; goto done; } p->mark = _mark; D(fprintf(stderr, "%*c%s _tmp_234[%d-%d]: %s failed!\n", p->level, ' ', - p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "',' star_target")); + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(assignment_expression | expression !':=') !'='")); } _res = NULL; done: @@ -38050,9 +38075,102 @@ _tmp_235_rule(Parser *p) return _res; } -// _tmp_236: star_targets '=' +// _tmp_236: ',' star_target static void * _tmp_236_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void * _res = NULL; + int _mark = p->mark; + { // ',' star_target + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_236[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' star_target")); + Token * _literal; + expr_ty c; + if ( + (_literal = _PyPegen_expect_token(p, 12)) // token=',' + && + (c = star_target_rule(p)) // star_target + ) + { + D(fprintf(stderr, "%*c+ _tmp_236[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "',' star_target")); + _res = c; + if (_res == NULL && PyErr_Occurred()) { + p->error_indicator = 1; + p->level--; + return NULL; + } + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_236[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "',' star_target")); + } + _res = NULL; + done: + p->level--; + return _res; +} + +// _tmp_237: +// | ','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs +static void * +_tmp_237_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void * _res = NULL; + int _mark = p->mark; + { // ','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_237[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs")); + asdl_seq * _gather_248_var; + Token * _literal; + asdl_seq* kwargs_var; + if ( + (_gather_248_var = _gather_248_rule(p)) // ','.(starred_expression | (assignment_expression | expression !':=') !'=')+ + && + (_literal = _PyPegen_expect_token(p, 12)) // token=',' + && + (kwargs_var = kwargs_rule(p)) // kwargs + ) + { + D(fprintf(stderr, "%*c+ _tmp_237[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs")); + _res = _PyPegen_dummy_name(p, _gather_248_var, _literal, kwargs_var); + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_237[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "','.(starred_expression | (assignment_expression | expression !':=') !'=')+ ',' kwargs")); + } + _res = NULL; + done: + p->level--; + return _res; +} + +// _tmp_238: star_targets '=' +static void * +_tmp_238_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38069,7 +38187,7 @@ _tmp_236_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_236[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c> _tmp_238[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); Token * _literal; expr_ty star_targets_var; if ( @@ -38078,12 +38196,12 @@ _tmp_236_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_236[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c+ _tmp_238[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); _res = _PyPegen_dummy_name(p, star_targets_var, _literal); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_236[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_238[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "star_targets '='")); } _res = NULL; @@ -38092,9 +38210,9 @@ _tmp_236_rule(Parser *p) return _res; } -// _tmp_237: star_targets '=' +// _tmp_239: star_targets '=' static void * -_tmp_237_rule(Parser *p) +_tmp_239_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38111,7 +38229,7 @@ _tmp_237_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_237[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c> _tmp_239[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "star_targets '='")); Token * _literal; expr_ty star_targets_var; if ( @@ -38120,12 +38238,12 @@ _tmp_237_rule(Parser *p) (_literal = _PyPegen_expect_token(p, 22)) // token='=' ) { - D(fprintf(stderr, "%*c+ _tmp_237[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); + D(fprintf(stderr, "%*c+ _tmp_239[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "star_targets '='")); _res = _PyPegen_dummy_name(p, star_targets_var, _literal); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_237[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_239[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "star_targets '='")); } _res = NULL; @@ -38134,9 +38252,9 @@ _tmp_237_rule(Parser *p) return _res; } -// _tmp_238: ')' | '**' +// _tmp_240: ')' | '**' static void * -_tmp_238_rule(Parser *p) +_tmp_240_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38153,18 +38271,18 @@ _tmp_238_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_238[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c> _tmp_240[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "')'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 8)) // token=')' ) { - D(fprintf(stderr, "%*c+ _tmp_238[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); + D(fprintf(stderr, "%*c+ _tmp_240[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "')'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_238[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_240[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "')'")); } { // '**' @@ -38172,18 +38290,18 @@ _tmp_238_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_238[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c> _tmp_240[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 35)) // token='**' ) { - D(fprintf(stderr, "%*c+ _tmp_238[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c+ _tmp_240[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_238[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_240[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'**'")); } _res = NULL; @@ -38192,9 +38310,9 @@ _tmp_238_rule(Parser *p) return _res; } -// _tmp_239: ':' | '**' +// _tmp_241: ':' | '**' static void * -_tmp_239_rule(Parser *p) +_tmp_241_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38211,18 +38329,18 @@ _tmp_239_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_239[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c> _tmp_241[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "':'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 11)) // token=':' ) { - D(fprintf(stderr, "%*c+ _tmp_239[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); + D(fprintf(stderr, "%*c+ _tmp_241[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "':'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_239[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_241[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "':'")); } { // '**' @@ -38230,18 +38348,18 @@ _tmp_239_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_239[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c> _tmp_241[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'**'")); Token * _literal; if ( (_literal = _PyPegen_expect_token(p, 35)) // token='**' ) { - D(fprintf(stderr, "%*c+ _tmp_239[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); + D(fprintf(stderr, "%*c+ _tmp_241[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'**'")); _res = _literal; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_239[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_241[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'**'")); } _res = NULL; @@ -38250,9 +38368,9 @@ _tmp_239_rule(Parser *p) return _res; } -// _tmp_240: expression ['as' star_target] +// _tmp_242: expression ['as' star_target] static void * -_tmp_240_rule(Parser *p) +_tmp_242_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38269,22 +38387,22 @@ _tmp_240_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_240[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_242[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_246_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_250_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_240[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_242[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); _res = _PyPegen_dummy_name(p, expression_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_240[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_242[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expression ['as' star_target]")); } _res = NULL; @@ -38293,9 +38411,9 @@ _tmp_240_rule(Parser *p) return _res; } -// _tmp_241: expressions ['as' star_target] +// _tmp_243: expressions ['as' star_target] static void * -_tmp_241_rule(Parser *p) +_tmp_243_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38312,22 +38430,22 @@ _tmp_241_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_241[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_243[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expressions_var; if ( (expressions_var = expressions_rule(p)) // expressions && - (_opt_var = _tmp_247_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_251_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_241[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_243[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); _res = _PyPegen_dummy_name(p, expressions_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_241[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_243[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "expressions ['as' star_target]")); } _res = NULL; @@ -38336,9 +38454,9 @@ _tmp_241_rule(Parser *p) return _res; } -// _tmp_242: expression ['as' star_target] +// _tmp_244: expression ['as' star_target] static void * -_tmp_242_rule(Parser *p) +_tmp_244_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38355,22 +38473,22 @@ _tmp_242_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_242[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_244[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression && - (_opt_var = _tmp_248_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_252_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_242[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_244[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression ['as' star_target]")); _res = _PyPegen_dummy_name(p, expression_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_242[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_244[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expression ['as' star_target]")); } _res = NULL; @@ -38379,9 +38497,9 @@ _tmp_242_rule(Parser *p) return _res; } -// _tmp_243: expressions ['as' star_target] +// _tmp_245: expressions ['as' star_target] static void * -_tmp_243_rule(Parser *p) +_tmp_245_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38398,22 +38516,22 @@ _tmp_243_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_243[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c> _tmp_245[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); void *_opt_var; UNUSED(_opt_var); // Silence compiler warnings expr_ty expressions_var; if ( (expressions_var = expressions_rule(p)) // expressions && - (_opt_var = _tmp_249_rule(p), !p->error_indicator) // ['as' star_target] + (_opt_var = _tmp_253_rule(p), !p->error_indicator) // ['as' star_target] ) { - D(fprintf(stderr, "%*c+ _tmp_243[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); + D(fprintf(stderr, "%*c+ _tmp_245[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expressions ['as' star_target]")); _res = _PyPegen_dummy_name(p, expressions_var, _opt_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_243[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_245[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "expressions ['as' star_target]")); } _res = NULL; @@ -38422,9 +38540,9 @@ _tmp_243_rule(Parser *p) return _res; } -// _tmp_244: 'as' NAME +// _tmp_246: 'as' NAME static void * -_tmp_244_rule(Parser *p) +_tmp_246_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38441,7 +38559,7 @@ _tmp_244_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_244[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c> _tmp_246[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' NAME")); Token * _keyword; expr_ty name_var; if ( @@ -38450,12 +38568,12 @@ _tmp_244_rule(Parser *p) (name_var = _PyPegen_name_token(p)) // NAME ) { - D(fprintf(stderr, "%*c+ _tmp_244[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); + D(fprintf(stderr, "%*c+ _tmp_246[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' NAME")); _res = _PyPegen_dummy_name(p, _keyword, name_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_244[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_246[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'as' NAME")); } _res = NULL; @@ -38464,9 +38582,9 @@ _tmp_244_rule(Parser *p) return _res; } -// _tmp_245: assignment_expression | expression !':=' +// _tmp_247: assignment_expression | expression !':=' static void * -_tmp_245_rule(Parser *p) +_tmp_247_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38483,18 +38601,18 @@ _tmp_245_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_245[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); + D(fprintf(stderr, "%*c> _tmp_247[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); expr_ty assignment_expression_var; if ( (assignment_expression_var = assignment_expression_rule(p)) // assignment_expression ) { - D(fprintf(stderr, "%*c+ _tmp_245[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "assignment_expression")); + D(fprintf(stderr, "%*c+ _tmp_247[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "assignment_expression")); _res = assignment_expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_245[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_247[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "assignment_expression")); } { // expression !':=' @@ -38502,7 +38620,7 @@ _tmp_245_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_245[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); + D(fprintf(stderr, "%*c> _tmp_247[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); expr_ty expression_var; if ( (expression_var = expression_rule(p)) // expression @@ -38510,12 +38628,12 @@ _tmp_245_rule(Parser *p) _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 53) // token=':=' ) { - D(fprintf(stderr, "%*c+ _tmp_245[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression !':='")); + D(fprintf(stderr, "%*c+ _tmp_247[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression !':='")); _res = expression_var; goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_245[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_247[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expression !':='")); } _res = NULL; @@ -38524,9 +38642,129 @@ _tmp_245_rule(Parser *p) return _res; } -// _tmp_246: 'as' star_target +// _loop0_249: ',' (starred_expression | (assignment_expression | expression !':=') !'=') +static asdl_seq * +_loop0_249_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void *_res = NULL; + int _mark = p->mark; + void **_children = PyMem_Malloc(sizeof(void *)); + if (!_children) { + p->error_indicator = 1; + PyErr_NoMemory(); + p->level--; + return NULL; + } + Py_ssize_t _children_capacity = 1; + Py_ssize_t _n = 0; + { // ',' (starred_expression | (assignment_expression | expression !':=') !'=') + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _loop0_249[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "',' (starred_expression | (assignment_expression | expression !':=') !'=')")); + Token * _literal; + void *elem; + while ( + (_literal = _PyPegen_expect_token(p, 12)) // token=',' + && + (elem = _tmp_254_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' + ) + { + _res = elem; + if (_res == NULL && PyErr_Occurred()) { + p->error_indicator = 1; + PyMem_Free(_children); + p->level--; + return NULL; + } + if (_n == _children_capacity) { + _children_capacity *= 2; + void **_new_children = PyMem_Realloc(_children, _children_capacity*sizeof(void *)); + if (!_new_children) { + PyMem_Free(_children); + p->error_indicator = 1; + PyErr_NoMemory(); + p->level--; + return NULL; + } + _children = _new_children; + } + _children[_n++] = _res; + _mark = p->mark; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _loop0_249[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "',' (starred_expression | (assignment_expression | expression !':=') !'=')")); + } + asdl_seq *_seq = (asdl_seq*)_Py_asdl_generic_seq_new(_n, p->arena); + if (!_seq) { + PyMem_Free(_children); + p->error_indicator = 1; + PyErr_NoMemory(); + p->level--; + return NULL; + } + for (int i = 0; i < _n; i++) asdl_seq_SET_UNTYPED(_seq, i, _children[i]); + PyMem_Free(_children); + p->level--; + return _seq; +} + +// _gather_248: +// | (starred_expression | (assignment_expression | expression !':=') !'=') _loop0_249 +static asdl_seq * +_gather_248_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + asdl_seq * _res = NULL; + int _mark = p->mark; + { // (starred_expression | (assignment_expression | expression !':=') !'=') _loop0_249 + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _gather_248[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(starred_expression | (assignment_expression | expression !':=') !'=') _loop0_249")); + void *elem; + asdl_seq * seq; + if ( + (elem = _tmp_254_rule(p)) // starred_expression | (assignment_expression | expression !':=') !'=' + && + (seq = _loop0_249_rule(p)) // _loop0_249 + ) + { + D(fprintf(stderr, "%*c+ _gather_248[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(starred_expression | (assignment_expression | expression !':=') !'=') _loop0_249")); + _res = _PyPegen_seq_insert_in_front(p, elem, seq); + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _gather_248[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "(starred_expression | (assignment_expression | expression !':=') !'=') _loop0_249")); + } + _res = NULL; + done: + p->level--; + return _res; +} + +// _tmp_250: 'as' star_target static void * -_tmp_246_rule(Parser *p) +_tmp_250_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38543,7 +38781,7 @@ _tmp_246_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_246[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_250[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38552,12 +38790,12 @@ _tmp_246_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_246[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_250[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_246[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_250[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38566,9 +38804,9 @@ _tmp_246_rule(Parser *p) return _res; } -// _tmp_247: 'as' star_target +// _tmp_251: 'as' star_target static void * -_tmp_247_rule(Parser *p) +_tmp_251_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38585,7 +38823,7 @@ _tmp_247_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_247[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_251[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38594,12 +38832,12 @@ _tmp_247_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_247[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_251[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_247[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_251[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38608,9 +38846,9 @@ _tmp_247_rule(Parser *p) return _res; } -// _tmp_248: 'as' star_target +// _tmp_252: 'as' star_target static void * -_tmp_248_rule(Parser *p) +_tmp_252_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38627,7 +38865,7 @@ _tmp_248_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_248[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_252[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38636,12 +38874,12 @@ _tmp_248_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_248[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_252[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_248[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_252[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38650,9 +38888,9 @@ _tmp_248_rule(Parser *p) return _res; } -// _tmp_249: 'as' star_target +// _tmp_253: 'as' star_target static void * -_tmp_249_rule(Parser *p) +_tmp_253_rule(Parser *p) { if (p->level++ == MAXSTACK) { p->error_indicator = 1; @@ -38669,7 +38907,7 @@ _tmp_249_rule(Parser *p) p->level--; return NULL; } - D(fprintf(stderr, "%*c> _tmp_249[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c> _tmp_253[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "'as' star_target")); Token * _keyword; expr_ty star_target_var; if ( @@ -38678,12 +38916,12 @@ _tmp_249_rule(Parser *p) (star_target_var = star_target_rule(p)) // star_target ) { - D(fprintf(stderr, "%*c+ _tmp_249[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); + D(fprintf(stderr, "%*c+ _tmp_253[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "'as' star_target")); _res = _PyPegen_dummy_name(p, _keyword, star_target_var); goto done; } p->mark = _mark; - D(fprintf(stderr, "%*c%s _tmp_249[%d-%d]: %s failed!\n", p->level, ' ', + D(fprintf(stderr, "%*c%s _tmp_253[%d-%d]: %s failed!\n", p->level, ' ', p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "'as' star_target")); } _res = NULL; @@ -38692,6 +38930,126 @@ _tmp_249_rule(Parser *p) return _res; } +// _tmp_254: starred_expression | (assignment_expression | expression !':=') !'=' +static void * +_tmp_254_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void * _res = NULL; + int _mark = p->mark; + { // starred_expression + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_254[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "starred_expression")); + expr_ty starred_expression_var; + if ( + (starred_expression_var = starred_expression_rule(p)) // starred_expression + ) + { + D(fprintf(stderr, "%*c+ _tmp_254[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "starred_expression")); + _res = starred_expression_var; + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_254[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "starred_expression")); + } + { // (assignment_expression | expression !':=') !'=' + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_254[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + void *_tmp_255_var; + if ( + (_tmp_255_var = _tmp_255_rule(p)) // assignment_expression | expression !':=' + && + _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 22) // token='=' + ) + { + D(fprintf(stderr, "%*c+ _tmp_254[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + _res = _tmp_255_var; + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_254[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "(assignment_expression | expression !':=') !'='")); + } + _res = NULL; + done: + p->level--; + return _res; +} + +// _tmp_255: assignment_expression | expression !':=' +static void * +_tmp_255_rule(Parser *p) +{ + if (p->level++ == MAXSTACK) { + p->error_indicator = 1; + PyErr_NoMemory(); + } + if (p->error_indicator) { + p->level--; + return NULL; + } + void * _res = NULL; + int _mark = p->mark; + { // assignment_expression + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_255[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "assignment_expression")); + expr_ty assignment_expression_var; + if ( + (assignment_expression_var = assignment_expression_rule(p)) // assignment_expression + ) + { + D(fprintf(stderr, "%*c+ _tmp_255[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "assignment_expression")); + _res = assignment_expression_var; + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_255[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" : "-", _mark, p->mark, "assignment_expression")); + } + { // expression !':=' + if (p->error_indicator) { + p->level--; + return NULL; + } + D(fprintf(stderr, "%*c> _tmp_255[%d-%d]: %s\n", p->level, ' ', _mark, p->mark, "expression !':='")); + expr_ty expression_var; + if ( + (expression_var = expression_rule(p)) // expression + && + _PyPegen_lookahead_with_int(0, _PyPegen_expect_token, p, 53) // token=':=' + ) + { + D(fprintf(stderr, "%*c+ _tmp_255[%d-%d]: %s succeeded!\n", p->level, ' ', _mark, p->mark, "expression !':='")); + _res = expression_var; + goto done; + } + p->mark = _mark; + D(fprintf(stderr, "%*c%s _tmp_255[%d-%d]: %s failed!\n", p->level, ' ', + p->error_indicator ? "ERROR!" 
: "-", _mark, p->mark, "expression !':='")); + } + _res = NULL; + done: + p->level--; + return _res; +} + void * _PyPegen_parse(Parser *p) { From 8e3d90c332b3e9ee84f5cd3d47f98e3c77b0f286 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 18 Oct 2023 15:25:04 +0200 Subject: [PATCH 097/265] [3.11] gh-111015: Install IDLE.app and Python Launcher.app on macOS with correct permissions (gh-111038) (cherry picked from commit cb1bf89c4066f30c80f7d1193b586a2ff8c40579) Co-authored-by: Joshua Root Co-authored-by: Ned Deily --- Mac/Makefile.in | 2 ++ Mac/PythonLauncher/Makefile.in | 2 ++ .../next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst | 1 + 3 files changed, 5 insertions(+) create mode 100644 Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst diff --git a/Mac/Makefile.in b/Mac/Makefile.in index f9691288414538..a9c126bae5b336 100644 --- a/Mac/Makefile.in +++ b/Mac/Makefile.in @@ -260,6 +260,8 @@ install_IDLE: rm "$(DESTDIR)$(LIBDEST)/idlelib/config-extensions.def~" ; \ fi touch "$(DESTDIR)$(PYTHONAPPSDIR)/IDLE.app" + chmod -R ugo+rX,go-w "$(DESTDIR)$(PYTHONAPPSDIR)/IDLE.app" + chmod ugo+x "$(DESTDIR)$(PYTHONAPPSDIR)/IDLE.app/Contents/MacOS/IDLE" $(INSTALLED_PYTHONAPP): install_Python diff --git a/Mac/PythonLauncher/Makefile.in b/Mac/PythonLauncher/Makefile.in index 4c05f26e8358bc..9decadbdc60f00 100644 --- a/Mac/PythonLauncher/Makefile.in +++ b/Mac/PythonLauncher/Makefile.in @@ -27,6 +27,8 @@ install: Python\ Launcher.app -test -d "$(DESTDIR)$(PYTHONAPPSDIR)/Python Launcher.app" && rm -r "$(DESTDIR)$(PYTHONAPPSDIR)/Python Launcher.app" /bin/cp -r "Python Launcher.app" "$(DESTDIR)$(PYTHONAPPSDIR)" touch "$(DESTDIR)$(PYTHONAPPSDIR)/Python Launcher.app" + chmod -R ugo+rX,go-w "$(DESTDIR)$(PYTHONAPPSDIR)/Python Launcher.app" + chmod ugo+x "$(DESTDIR)$(PYTHONAPPSDIR)/Python Launcher.app/Contents/MacOS/Python Launcher" clean: rm -f *.o "Python Launcher" diff --git a/Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst b/Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst new file mode 100644 index 00000000000000..4c6eea136554c8 --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst @@ -0,0 +1 @@ +Ensure that IDLE.app and Python Launcher.app are installed with appropriate permissions on macOS builds. From faa7c207bfe6b04c88f8440d95c36739ae69d518 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 18 Oct 2023 22:34:20 +0200 Subject: [PATCH 098/265] [3.11] GH-104232: Fix statement about trace return values (GH-111045) (cherry picked from commit d9246c7b734b8958da03494045208681d95f5b74) --- Doc/library/sys.rst | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst index 34cb4cf038fe7e..65c20748f862b5 100644 --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -1520,9 +1520,8 @@ always available. function to be used for the new scope, or ``None`` if the scope shouldn't be traced. - The local trace function should return a reference to itself (or to another - function for further tracing in that scope), or ``None`` to turn off tracing - in that scope. + The local trace function should return a reference to itself, or to another + function which would then be used as the local trace function for the scope. If there is any error occurred in the trace function, it will be unset, just like ``settrace(None)`` is called. 
From fabfc2ccfc50e58284da48db9cb63f9dd58606a6 Mon Sep 17 00:00:00 2001 From: Tian Gao Date: Wed, 18 Oct 2023 16:42:36 -0700 Subject: [PATCH 099/265] [3.11] GH-65052: Prevent pdb from crashing when trying to display objects (GH-111002) (cherry picked from commit c523ce0f434582580a3721e15cb7dd6b56ad0236) --- Lib/pdb.py | 23 ++++++--- Lib/test/test_pdb.py | 47 +++++++++++++++++++ ...3-10-09-19-09-32.gh-issue-65052.C2mRlo.rst | 1 + 3 files changed, 64 insertions(+), 7 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst diff --git a/Lib/pdb.py b/Lib/pdb.py index fe9eab9b5e1ff0..5dfe5077654c21 100755 --- a/Lib/pdb.py +++ b/Lib/pdb.py @@ -405,8 +405,9 @@ def preloop(self): # fields are changed to be displayed if newvalue is not oldvalue and newvalue != oldvalue: displaying[expr] = newvalue - self.message('display %s: %r [old: %r]' % - (expr, newvalue, oldvalue)) + self.message('display %s: %s [old: %s]' % + (expr, self._safe_repr(newvalue, expr), + self._safe_repr(oldvalue, expr))) def interaction(self, frame, traceback): # Restore the previous signal handler at the Pdb prompt. @@ -1221,7 +1222,7 @@ def do_args(self, arg): for i in range(n): name = co.co_varnames[i] if name in dict: - self.message('%s = %r' % (name, dict[name])) + self.message('%s = %s' % (name, self._safe_repr(dict[name], name))) else: self.message('%s = *** undefined ***' % (name,)) do_a = do_args @@ -1231,7 +1232,7 @@ def do_retval(self, arg): Print the return value for the last return of a function. """ if '__return__' in self.curframe_locals: - self.message(repr(self.curframe_locals['__return__'])) + self.message(self._safe_repr(self.curframe_locals['__return__'], "retval")) else: self.error('Not yet returned!') do_rv = do_retval @@ -1268,6 +1269,12 @@ def _msg_val_func(self, arg, func): except: self._error_exc() + def _safe_repr(self, obj, expr): + try: + return repr(obj) + except Exception as e: + return _rstr(f"*** repr({expr}) failed: {self._format_exc(e)} ***") + def do_p(self, arg): """p expression Print the value of the expression. @@ -1438,12 +1445,12 @@ def do_display(self, arg): """ if not arg: self.message('Currently displaying:') - for item in self.displaying.get(self.curframe, {}).items(): - self.message('%s: %r' % item) + for key, val in self.displaying.get(self.curframe, {}).items(): + self.message('%s: %s' % (key, self._safe_repr(val, key))) else: val = self._getval_except(arg) self.displaying.setdefault(self.curframe, {})[arg] = val - self.message('display %s: %r' % (arg, val)) + self.message('display %s: %s' % (arg, self._safe_repr(val, arg))) complete_display = _complete_expression @@ -1645,6 +1652,8 @@ def _run(self, target: Union[_ModuleTarget, _ScriptTarget]): self.run(target.code) + def _format_exc(self, exc: BaseException): + return traceback.format_exception_only(exc)[-1].strip() def _getsourcelines(self, obj): # GH-103319 diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index 22fc2b35819466..8a0350b8590d8e 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -1704,6 +1704,53 @@ def test_pdb_issue_gh_103225(): (Pdb) continue """ +def test_pdb_issue_gh_65052(): + """See GH-65052 + + args, retval and display should not crash if the object is not displayable + >>> class A: + ... def __new__(cls): + ... import pdb; pdb.Pdb(nosigint=True, readrc=False).set_trace() + ... return object.__new__(cls) + ... def __init__(self): + ... import pdb; pdb.Pdb(nosigint=True, readrc=False).set_trace() + ... self.a = 1 + ... def __repr__(self): + ... 
return self.a + + >>> def test_function(): + ... A() + >>> with PdbTestInput([ # doctest: +ELLIPSIS +NORMALIZE_WHITESPACE + ... 's', + ... 'retval', + ... 'continue', + ... 'args', + ... 'display self', + ... 'display', + ... 'continue', + ... ]): + ... test_function() + > (4)__new__() + -> return object.__new__(cls) + (Pdb) s + --Return-- + > (4)__new__()-> + -> return object.__new__(cls) + (Pdb) retval + *** repr(retval) failed: AttributeError: 'A' object has no attribute 'a' *** + (Pdb) continue + > (7)__init__() + -> self.a = 1 + (Pdb) args + self = *** repr(self) failed: AttributeError: 'A' object has no attribute 'a' *** + (Pdb) display self + display self: *** repr(self) failed: AttributeError: 'A' object has no attribute 'a' *** + (Pdb) display + Currently displaying: + self: *** repr(self) failed: AttributeError: 'A' object has no attribute 'a' *** + (Pdb) continue + """ + @support.requires_subprocess() class PdbTestCase(unittest.TestCase): diff --git a/Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst b/Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst new file mode 100644 index 00000000000000..4739c63bb3cc9d --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst @@ -0,0 +1 @@ +Prevent :mod:`pdb` from crashing when trying to display undisplayable objects From 2258d6cfa2fcd0c290328e81b68aff25abf8c3d8 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 19 Oct 2023 06:30:43 +0200 Subject: [PATCH 100/265] [3.11] gh-111050: IDLE - Simplify configdialog.HighPage.theme_elements (GH-111053) (#111056) gh-111050: IDLE - Simplify configdialog.HighPage.theme_elements (GH-111053) Replace tuple value with internal name, removing numbers. Remove sorting of already ordered dislay names. Remove '[0]' indexing into now-gone tuple. (cherry picked from commit 642eb8df951f2f1d4bf4d93ee568707c5bf40a96) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/configdialog.py | 44 +++++++++++----------- Lib/idlelib/idle_test/test_configdialog.py | 2 +- 2 files changed, 22 insertions(+), 24 deletions(-) diff --git a/Lib/idlelib/configdialog.py b/Lib/idlelib/configdialog.py index d56afe8a807fea..eedf97bf74fe6a 100644 --- a/Lib/idlelib/configdialog.py +++ b/Lib/idlelib/configdialog.py @@ -576,24 +576,23 @@ def create_page_highlight(self): (*)theme_message: Label """ self.theme_elements = { - # Display_name: ('internal_name, sort_number'). - # TODO: remove sort_number unneeded with dict ordering. - 'Normal Code or Text': ('normal', '00'), - 'Code Context': ('context', '01'), - 'Python Keywords': ('keyword', '02'), - 'Python Definitions': ('definition', '03'), - 'Python Builtins': ('builtin', '04'), - 'Python Comments': ('comment', '05'), - 'Python Strings': ('string', '06'), - 'Selected Text': ('hilite', '07'), - 'Found Text': ('hit', '08'), - 'Cursor': ('cursor', '09'), - 'Editor Breakpoint': ('break', '10'), - 'Shell Prompt': ('console', '11'), - 'Error Text': ('error', '12'), - 'Shell User Output': ('stdout', '13'), - 'Shell User Exception': ('stderr', '14'), - 'Line Number': ('linenumber', '16'), + # Display-name: internal-config-tag-name. 
+ 'Normal Code or Text': 'normal', + 'Code Context': 'context', + 'Python Keywords': 'keyword', + 'Python Definitions': 'definition', + 'Python Builtins': 'builtin', + 'Python Comments': 'comment', + 'Python Strings': 'string', + 'Selected Text': 'hilite', + 'Found Text': 'hit', + 'Cursor': 'cursor', + 'Editor Breakpoint': 'break', + 'Shell Prompt': 'console', + 'Error Text': 'error', + 'Shell User Output': 'stdout', + 'Shell User Exception': 'stderr', + 'Line Number': 'linenumber', } self.builtin_name = tracers.add( StringVar(self), self.var_changed_builtin_name) @@ -653,7 +652,7 @@ def tem(event, elem=element): # event.widget.winfo_top_level().highlight_target.set(elem) self.highlight_target.set(elem) text.tag_bind( - self.theme_elements[element][0], '', tem) + self.theme_elements[element], '', tem) text['state'] = 'disabled' self.style.configure('frame_color_set.TFrame', borderwidth=1, relief='solid') @@ -761,7 +760,6 @@ def load_theme_cfg(self): self.set_theme_type() # Load theme element option menu. theme_names = list(self.theme_elements) - theme_names.sort(key=lambda x: self.theme_elements[x][1]) self.targetlist.SetMenu(theme_names, theme_names[0]) self.paint_theme_sample() self.set_highlight_target() @@ -888,7 +886,7 @@ def on_new_color_set(self): new_color = self.color.get() self.style.configure('frame_color_set.TFrame', background=new_color) plane = 'foreground' if self.fg_bg_toggle.get() else 'background' - sample_element = self.theme_elements[self.highlight_target.get()][0] + sample_element = self.theme_elements[self.highlight_target.get()] self.highlight_sample.tag_config(sample_element, **{plane: new_color}) theme = self.custom_name.get() theme_element = sample_element + '-' + plane @@ -1002,7 +1000,7 @@ def set_color_sample(self): frame_color_set """ # Set the color sample area. - tag = self.theme_elements[self.highlight_target.get()][0] + tag = self.theme_elements[self.highlight_target.get()] plane = 'foreground' if self.fg_bg_toggle.get() else 'background' color = self.highlight_sample.tag_cget(tag, plane) self.style.configure('frame_color_set.TFrame', background=color) @@ -1032,7 +1030,7 @@ def paint_theme_sample(self): else: # User theme theme = self.custom_name.get() for element_title in self.theme_elements: - element = self.theme_elements[element_title][0] + element = self.theme_elements[element_title] colors = idleConf.GetHighlight(theme, element) if element == 'cursor': # Cursor sample needs special painting. 
colors['background'] = idleConf.GetHighlight( diff --git a/Lib/idlelib/idle_test/test_configdialog.py b/Lib/idlelib/idle_test/test_configdialog.py index e5d5b4013fca57..6f8518a9bb19d0 100644 --- a/Lib/idlelib/idle_test/test_configdialog.py +++ b/Lib/idlelib/idle_test/test_configdialog.py @@ -430,7 +430,7 @@ def test_highlight_target_text_mouse(self): def tag_to_element(elem): for element, tag in d.theme_elements.items(): - elem[tag[0]] = element + elem[tag] = element def click_it(start): x, y, dx, dy = hs.bbox(start) From 26a028269b807d599ea2f5fa266fccfd1421775b Mon Sep 17 00:00:00 2001 From: Radislav Chugunov <52372310+chgnrdv@users.noreply.github.com> Date: Thu, 19 Oct 2023 16:26:40 +0300 Subject: [PATCH 101/265] [3.11] gh-108791: Fix pdb CLI invalid argument handling (GH-108816) (#111063) * [3.11] gh-108791: Fix `pdb` CLI invalid argument handling (GH-108816) (cherry picked from commit 162213f2db3835e1115178d38741544f4b4db416) Co-authored-by: Radislav Chugunov <52372310+chgnrdv@users.noreply.github.com> --- Lib/pdb.py | 6 ++++++ Lib/test/test_pdb.py | 19 +++++++++++++++++-- ...-09-02-16-07-23.gh-issue-108791.fBcAqh.rst | 1 + 3 files changed, 24 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst diff --git a/Lib/pdb.py b/Lib/pdb.py index 5dfe5077654c21..4a4a0b9d90f6d2 100755 --- a/Lib/pdb.py +++ b/Lib/pdb.py @@ -136,6 +136,9 @@ def check(self): if not os.path.exists(self): print('Error:', self.orig, 'does not exist') sys.exit(1) + if os.path.isdir(self): + print('Error:', self.orig, 'is a directory') + sys.exit(1) # Replace pdb's dir with script's dir in front of module search path. sys.path[0] = os.path.dirname(self) @@ -162,6 +165,9 @@ class _ModuleTarget(str): def check(self): try: self._details + except ImportError as e: + print(f"ImportError: {e}") + sys.exit(1) except Exception: traceback.print_exc() sys.exit(1) diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index 8a0350b8590d8e..5f9bcc5af82ad6 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -2159,8 +2159,7 @@ def test_module_without_a_main(self): stdout, stderr = self._run_pdb( ['-m', module_name], "", expected_returncode=1 ) - self.assertIn("ImportError: No module named t_main.__main__", - stdout.splitlines()) + self.assertIn("ImportError: No module named t_main.__main__;", stdout) def test_package_without_a_main(self): pkg_name = 't_pkg' @@ -2178,6 +2177,22 @@ def test_package_without_a_main(self): "'t_pkg.t_main' is a package and cannot be directly executed", stdout) + def test_nonexistent_module(self): + assert not os.path.exists(os_helper.TESTFN) + stdout, stderr = self._run_pdb(["-m", os_helper.TESTFN], "", expected_returncode=1) + self.assertIn(f"ImportError: No module named {os_helper.TESTFN}", stdout) + + def test_dir_as_script(self): + with os_helper.temp_dir() as temp_dir: + stdout, stderr = self._run_pdb([temp_dir], "", expected_returncode=1) + self.assertIn(f"Error: {temp_dir} is a directory", stdout) + + def test_invalid_cmd_line_options(self): + stdout, stderr = self._run_pdb(["-c"], "", expected_returncode=1) + self.assertIn(f"Error: option -c requires argument", stdout) + stdout, stderr = self._run_pdb(["--spam"], "", expected_returncode=1) + self.assertIn(f"Error: option --spam not recognized", stdout) + def test_blocks_at_first_code_line(self): script = """ #This is a comment, on line 2 diff --git a/Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst 
b/Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst new file mode 100644 index 00000000000000..84a2cd589e10d5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst @@ -0,0 +1 @@ +Improved error handling in :mod:`pdb` command line interface, making it produce more concise error messages. From 4fc5352a0d52d8077b4c2ad03c963a99b1fbe790 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 19 Oct 2023 17:31:29 +0200 Subject: [PATCH 102/265] [3.11] gh-101100: Fix sphinx warnings in `library/getpass.rst` (GH-110461) (#111072) Co-authored-by: Nikita Sobolev --- Doc/library/getpass.rst | 2 +- Doc/tools/.nitignore | 1 - 2 files changed, 1 insertion(+), 2 deletions(-) diff --git a/Doc/library/getpass.rst b/Doc/library/getpass.rst index d5bbe67fb30a62..5c79daf0f47d8e 100644 --- a/Doc/library/getpass.rst +++ b/Doc/library/getpass.rst @@ -43,7 +43,7 @@ The :mod:`getpass` module provides two functions: Return the "login name" of the user. This function checks the environment variables :envvar:`LOGNAME`, - :envvar:`USER`, :envvar:`LNAME` and :envvar:`USERNAME`, in order, and + :envvar:`USER`, :envvar:`!LNAME` and :envvar:`USERNAME`, in order, and returns the value of the first one which is set to a non-empty string. If none are set, the login name from the password database is returned on systems which support the :mod:`pwd` module, otherwise, an exception is diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 0ff65ccb21d378..b47924f25348fa 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -71,7 +71,6 @@ Doc/library/fcntl.rst Doc/library/ftplib.rst Doc/library/functions.rst Doc/library/functools.rst -Doc/library/getpass.rst Doc/library/gettext.rst Doc/library/gzip.rst Doc/library/http.client.rst From 1b60244b97f9181d943bc834c83fc5f9440df16b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 19 Oct 2023 17:33:12 +0200 Subject: [PATCH 103/265] [3.11] GH-101100: Fix reference warnings for ``__getitem__`` (GH-110118) (#111074) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> --- Doc/glossary.rst | 8 ++++---- Doc/library/abc.rst | 2 +- Doc/library/collections.abc.rst | 8 ++++---- Doc/library/collections.rst | 6 +++--- Doc/library/email.compat32-message.rst | 2 +- Doc/library/email.message.rst | 2 +- Doc/library/functions.rst | 4 ++-- Doc/library/mailbox.rst | 2 +- Doc/library/operator.rst | 4 ++-- Doc/library/stdtypes.rst | 2 +- Doc/library/unittest.mock.rst | 2 +- Doc/library/wsgiref.rst | 4 ++-- Doc/library/xml.dom.pulldom.rst | 2 +- Doc/reference/compound_stmts.rst | 2 +- Doc/reference/expressions.rst | 6 +++--- Doc/whatsnew/2.2.rst | 10 +++++----- Doc/whatsnew/2.3.rst | 4 ++-- Doc/whatsnew/3.8.rst | 2 +- Misc/NEWS.d/3.11.0a1.rst | 2 +- Misc/NEWS.d/3.8.0a1.rst | 2 +- 20 files changed, 38 insertions(+), 38 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst index 81599477fc9534..7c91a0a8184e6b 100644 --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -645,7 +645,7 @@ Glossary iterables include all sequence types (such as :class:`list`, :class:`str`, and :class:`tuple`) and some non-sequence types like :class:`dict`, :term:`file objects `, and objects of any classes you define - with an :meth:`__iter__` method or with a :meth:`__getitem__` method + with an :meth:`__iter__` method or with a :meth:`~object.__getitem__` method that implements :term:`sequence` semantics. 
Iterables can be @@ -1085,17 +1085,17 @@ Glossary sequence An :term:`iterable` which supports efficient element access using integer - indices via the :meth:`__getitem__` special method and defines a + indices via the :meth:`~object.__getitem__` special method and defines a :meth:`__len__` method that returns the length of the sequence. Some built-in sequence types are :class:`list`, :class:`str`, :class:`tuple`, and :class:`bytes`. Note that :class:`dict` also - supports :meth:`__getitem__` and :meth:`__len__`, but is considered a + supports :meth:`~object.__getitem__` and :meth:`__len__`, but is considered a mapping rather than a sequence because the lookups use arbitrary :term:`immutable` keys rather than integers. The :class:`collections.abc.Sequence` abstract base class defines a much richer interface that goes beyond just - :meth:`__getitem__` and :meth:`__len__`, adding :meth:`count`, + :meth:`~object.__getitem__` and :meth:`__len__`, adding :meth:`count`, :meth:`index`, :meth:`__contains__`, and :meth:`__reversed__`. Types that implement this expanded interface can be registered explicitly using diff --git a/Doc/library/abc.rst b/Doc/library/abc.rst index 274b2d69f0ab5c..fb4f9da169c5ab 100644 --- a/Doc/library/abc.rst +++ b/Doc/library/abc.rst @@ -154,7 +154,7 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: Finally, the last line makes ``Foo`` a virtual subclass of ``MyIterable``, even though it does not define an :meth:`~iterator.__iter__` method (it uses the old-style iterable protocol, defined in terms of :meth:`__len__` and - :meth:`__getitem__`). Note that this will not make ``get_iterator`` + :meth:`~object.__getitem__`). Note that this will not make ``get_iterator`` available as a method of ``Foo``, so it is provided separately. diff --git a/Doc/library/collections.abc.rst b/Doc/library/collections.abc.rst index 1ada0d352a0cc6..e4695b44320600 100644 --- a/Doc/library/collections.abc.rst +++ b/Doc/library/collections.abc.rst @@ -191,7 +191,7 @@ ABC Inherits from Abstract Methods Mi .. [2] Checking ``isinstance(obj, Iterable)`` detects classes that are registered as :class:`Iterable` or that have an :meth:`__iter__` method, but it does not detect classes that iterate with the - :meth:`__getitem__` method. The only reliable way to determine + :meth:`~object.__getitem__` method. The only reliable way to determine whether an object is :term:`iterable` is to call ``iter(obj)``. @@ -221,7 +221,7 @@ Collections Abstract Base Classes -- Detailed Descriptions Checking ``isinstance(obj, Iterable)`` detects classes that are registered as :class:`Iterable` or that have an :meth:`__iter__` method, but it does - not detect classes that iterate with the :meth:`__getitem__` method. + not detect classes that iterate with the :meth:`~object.__getitem__` method. The only reliable way to determine whether an object is :term:`iterable` is to call ``iter(obj)``. @@ -261,8 +261,8 @@ Collections Abstract Base Classes -- Detailed Descriptions Implementation note: Some of the mixin methods, such as :meth:`__iter__`, :meth:`__reversed__` and :meth:`index`, make - repeated calls to the underlying :meth:`__getitem__` method. - Consequently, if :meth:`__getitem__` is implemented with constant + repeated calls to the underlying :meth:`~object.__getitem__` method. 
+ Consequently, if :meth:`~object.__getitem__` is implemented with constant access speed, the mixin methods will have linear performance; however, if the underlying method is linear (as it would be with a linked list), the mixins will have quadratic performance and will diff --git a/Doc/library/collections.rst b/Doc/library/collections.rst index 5ea3a5dd6254c7..a29cc9530390bc 100644 --- a/Doc/library/collections.rst +++ b/Doc/library/collections.rst @@ -742,12 +742,12 @@ stack manipulations such as ``dup``, ``drop``, ``swap``, ``over``, ``pick``, If calling :attr:`default_factory` raises an exception this exception is propagated unchanged. - This method is called by the :meth:`__getitem__` method of the + This method is called by the :meth:`~object.__getitem__` method of the :class:`dict` class when the requested key is not found; whatever it - returns or raises is then returned or raised by :meth:`__getitem__`. + returns or raises is then returned or raised by :meth:`~object.__getitem__`. Note that :meth:`__missing__` is *not* called for any operations besides - :meth:`__getitem__`. This means that :meth:`get` will, like normal + :meth:`~object.__getitem__`. This means that :meth:`get` will, like normal dictionaries, return ``None`` as a default rather than using :attr:`default_factory`. diff --git a/Doc/library/email.compat32-message.rst b/Doc/library/email.compat32-message.rst index 5bef155a4af310..c4c322a82e1f44 100644 --- a/Doc/library/email.compat32-message.rst +++ b/Doc/library/email.compat32-message.rst @@ -367,7 +367,7 @@ Here are the methods of the :class:`Message` class: .. method:: get(name, failobj=None) Return the value of the named header field. This is identical to - :meth:`__getitem__` except that optional *failobj* is returned if the + :meth:`~object.__getitem__` except that optional *failobj* is returned if the named header is missing (defaults to ``None``). Here are some additional useful methods: diff --git a/Doc/library/email.message.rst b/Doc/library/email.message.rst index 225f498781fa86..f58d93da6ed687 100644 --- a/Doc/library/email.message.rst +++ b/Doc/library/email.message.rst @@ -247,7 +247,7 @@ message objects. .. method:: get(name, failobj=None) Return the value of the named header field. This is identical to - :meth:`__getitem__` except that optional *failobj* is returned if the + :meth:`~object.__getitem__` except that optional *failobj* is returned if the named header is missing (*failobj* defaults to ``None``). diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 5e5fd467058edf..c287f52156ba63 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -982,7 +982,7 @@ are always available. They are listed here in alphabetical order. differently depending on the presence of the second argument. Without a second argument, *object* must be a collection object which supports the :term:`iterable` protocol (the :meth:`__iter__` method), or it must support - the sequence protocol (the :meth:`__getitem__` method with integer arguments + the sequence protocol (the :meth:`~object.__getitem__` method with integer arguments starting at ``0``). If it does not support either of those protocols, :exc:`TypeError` is raised. If the second argument, *sentinel*, is given, then *object* must be a callable object. The iterator created in this case @@ -1562,7 +1562,7 @@ are always available. They are listed here in alphabetical order. Return a reverse :term:`iterator`. 
*seq* must be an object which has a :meth:`__reversed__` method or supports the sequence protocol (the - :meth:`__len__` method and the :meth:`__getitem__` method with integer + :meth:`__len__` method and the :meth:`~object.__getitem__` method with integer arguments starting at ``0``). diff --git a/Doc/library/mailbox.rst b/Doc/library/mailbox.rst index 91df07d914cae2..b27deb20f13236 100644 --- a/Doc/library/mailbox.rst +++ b/Doc/library/mailbox.rst @@ -167,7 +167,7 @@ Supported mailbox formats are Maildir, mbox, MH, Babyl, and MMDF. Return a representation of the message corresponding to *key*. If no such message exists, *default* is returned if the method was called as :meth:`get` and a :exc:`KeyError` exception is raised if the method was - called as :meth:`__getitem__`. The message is represented as an instance + called as :meth:`~object.__getitem__`. The message is represented as an instance of the appropriate format-specific :class:`Message` subclass unless a custom message factory was specified when the :class:`Mailbox` instance was initialized. diff --git a/Doc/library/operator.rst b/Doc/library/operator.rst index 57c67bcf3aa12e..96f2c287875d41 100644 --- a/Doc/library/operator.rst +++ b/Doc/library/operator.rst @@ -306,7 +306,7 @@ expect a function argument. itemgetter(*items) Return a callable object that fetches *item* from its operand using the - operand's :meth:`__getitem__` method. If multiple items are specified, + operand's :meth:`~object.__getitem__` method. If multiple items are specified, returns a tuple of lookup values. For example: * After ``f = itemgetter(2)``, the call ``f(r)`` returns ``r[2]``. @@ -326,7 +326,7 @@ expect a function argument. return tuple(obj[item] for item in items) return g - The items can be any type accepted by the operand's :meth:`__getitem__` + The items can be any type accepted by the operand's :meth:`~object.__getitem__` method. Dictionaries accept any :term:`hashable` value. Lists, tuples, and strings accept an index or a slice: diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index e9ecd7821ec837..ad3135cbc146ed 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -2220,7 +2220,7 @@ expression support in the :mod:`re` module). Return a copy of the string in which each character has been mapped through the given translation table. The table must be an object that implements - indexing via :meth:`__getitem__`, typically a :term:`mapping` or + indexing via :meth:`~object.__getitem__`, typically a :term:`mapping` or :term:`sequence`. When indexed by a Unicode ordinal (an integer), the table object can do any of the following: return a Unicode ordinal or a string, to map the character to one or more other characters; return diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst index d3398e9407d40f..dee5c9e88a0a44 100644 --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -1656,7 +1656,7 @@ Keywords can be used in the :func:`patch.dict` call to set values in the diction :func:`patch.dict` can be used with dictionary like objects that aren't actually dictionaries. At the very minimum they must support item getting, setting, deleting and either iteration or membership test. This corresponds to the -magic methods :meth:`__getitem__`, :meth:`__setitem__`, :meth:`__delitem__` and either +magic methods :meth:`~object.__getitem__`, :meth:`__setitem__`, :meth:`__delitem__` and either :meth:`__iter__` or :meth:`__contains__`. 
>>> class Container: diff --git a/Doc/library/wsgiref.rst b/Doc/library/wsgiref.rst index 39a4c1ba142338..be9e56b04c1fbf 100644 --- a/Doc/library/wsgiref.rst +++ b/Doc/library/wsgiref.rst @@ -180,7 +180,7 @@ also provides these miscellaneous utilities: print(chunk) .. versionchanged:: 3.11 - Support for :meth:`__getitem__` method has been removed. + Support for :meth:`~object.__getitem__` method has been removed. :mod:`wsgiref.headers` -- WSGI response header tools @@ -201,7 +201,7 @@ manipulation of WSGI response headers using a mapping-like interface. an empty list. :class:`Headers` objects support typical mapping operations including - :meth:`__getitem__`, :meth:`get`, :meth:`__setitem__`, :meth:`setdefault`, + :meth:`~object.__getitem__`, :meth:`get`, :meth:`__setitem__`, :meth:`setdefault`, :meth:`__delitem__` and :meth:`__contains__`. For each of these methods, the key is the header name (treated case-insensitively), and the value is the first value associated with that header name. Setting a header diff --git a/Doc/library/xml.dom.pulldom.rst b/Doc/library/xml.dom.pulldom.rst index d1df465a598e53..843c2fd7fdb937 100644 --- a/Doc/library/xml.dom.pulldom.rst +++ b/Doc/library/xml.dom.pulldom.rst @@ -115,7 +115,7 @@ DOMEventStream Objects .. class:: DOMEventStream(stream, parser, bufsize) .. versionchanged:: 3.11 - Support for :meth:`__getitem__` method has been removed. + Support for :meth:`~object.__getitem__` method has been removed. .. method:: getEvent() diff --git a/Doc/reference/compound_stmts.rst b/Doc/reference/compound_stmts.rst index 19df7f63f12536..4efe30dbd25b83 100644 --- a/Doc/reference/compound_stmts.rst +++ b/Doc/reference/compound_stmts.rst @@ -1060,7 +1060,7 @@ subject value: .. note:: Key-value pairs are matched using the two-argument form of the mapping subject's ``get()`` method. Matched key-value pairs must already be present in the mapping, and not created on-the-fly via :meth:`__missing__` or - :meth:`__getitem__`. + :meth:`~object.__getitem__`. In simple terms ``{KEY1: P1, KEY2: P2, ... }`` matches only if all the following happens: diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index 3c4706714f306c..5cba4a6934f05b 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -872,7 +872,7 @@ to the index so that, for example, ``x[-1]`` selects the last item of ``x``. The resulting value must be a nonnegative integer less than the number of items in the sequence, and the subscription selects the item whose index is that value (counting from zero). Since the support for negative indices and slicing -occurs in the object's :meth:`__getitem__` method, subclasses overriding +occurs in the object's :meth:`~object.__getitem__` method, subclasses overriding this method will need to explicitly add that support. .. index:: @@ -927,7 +927,7 @@ slice list contains no proper slice). single: step (slice object attribute) The semantics for a slicing are as follows. The primary is indexed (using the -same :meth:`__getitem__` method as +same :meth:`~object.__getitem__` method as normal subscription) with a key that is constructed from the slice list, as follows. If the slice list contains at least one comma, the key is a tuple containing the conversion of the slice items; otherwise, the conversion of the @@ -1653,7 +1653,7 @@ If an exception is raised during the iteration, it is as if :keyword:`in` raised that exception. 
Lastly, the old-style iteration protocol is tried: if a class defines -:meth:`__getitem__`, ``x in y`` is ``True`` if and only if there is a non-negative +:meth:`~object.__getitem__`, ``x in y`` is ``True`` if and only if there is a non-negative integer index *i* such that ``x is y[i] or x == y[i]``, and no lower integer index raises the :exc:`IndexError` exception. (If any other exception is raised, it is as if :keyword:`in` raised that exception). diff --git a/Doc/whatsnew/2.2.rst b/Doc/whatsnew/2.2.rst index d9ead57413cbbf..6dfe79cef00987 100644 --- a/Doc/whatsnew/2.2.rst +++ b/Doc/whatsnew/2.2.rst @@ -424,22 +424,22 @@ Another significant addition to 2.2 is an iteration interface at both the C and Python levels. Objects can define how they can be looped over by callers. In Python versions up to 2.1, the usual way to make ``for item in obj`` work is -to define a :meth:`__getitem__` method that looks something like this:: +to define a :meth:`~object.__getitem__` method that looks something like this:: def __getitem__(self, index): return -:meth:`__getitem__` is more properly used to define an indexing operation on an +:meth:`~object.__getitem__` is more properly used to define an indexing operation on an object so that you can write ``obj[5]`` to retrieve the sixth element. It's a bit misleading when you're using this only to support :keyword:`for` loops. Consider some file-like object that wants to be looped over; the *index* parameter is essentially meaningless, as the class probably assumes that a -series of :meth:`__getitem__` calls will be made with *index* incrementing by -one each time. In other words, the presence of the :meth:`__getitem__` method +series of :meth:`~object.__getitem__` calls will be made with *index* incrementing by +one each time. In other words, the presence of the :meth:`~object.__getitem__` method doesn't mean that using ``file[5]`` to randomly access the sixth element will work, though it really should. -In Python 2.2, iteration can be implemented separately, and :meth:`__getitem__` +In Python 2.2, iteration can be implemented separately, and :meth:`~object.__getitem__` methods can be limited to classes that really do support random access. The basic idea of iterators is simple. A new built-in function, ``iter(obj)`` or ``iter(C, sentinel)``, is used to get an iterator. ``iter(obj)`` returns diff --git a/Doc/whatsnew/2.3.rst b/Doc/whatsnew/2.3.rst index 3460421fe45cce..8a4a9c412cf259 100644 --- a/Doc/whatsnew/2.3.rst +++ b/Doc/whatsnew/2.3.rst @@ -925,7 +925,7 @@ Deletion is more straightforward:: >>> a [1, 3] -One can also now pass slice objects to the :meth:`__getitem__` methods of the +One can also now pass slice objects to the :meth:`~object.__getitem__` methods of the built-in sequences:: >>> range(10).__getitem__(slice(0, 5, 2)) @@ -1596,7 +1596,7 @@ complete list of changes, or look through the CVS logs for all the details. module. Adding the mix-in as a superclass provides the full dictionary interface - whenever the class defines :meth:`__getitem__`, :meth:`__setitem__`, + whenever the class defines :meth:`~object.__getitem__`, :meth:`__setitem__`, :meth:`__delitem__`, and :meth:`keys`. For example:: >>> import UserDict diff --git a/Doc/whatsnew/3.8.rst b/Doc/whatsnew/3.8.rst index 3789164cc1f72e..0a402bd0176004 100644 --- a/Doc/whatsnew/3.8.rst +++ b/Doc/whatsnew/3.8.rst @@ -1650,7 +1650,7 @@ Deprecated deprecated and will be prohibited in Python 3.9. (Contributed by Elvis Pranskevichus in :issue:`34075`.) 
-* The :meth:`__getitem__` methods of :class:`xml.dom.pulldom.DOMEventStream`, +* The :meth:`~object.__getitem__` methods of :class:`xml.dom.pulldom.DOMEventStream`, :class:`wsgiref.util.FileWrapper` and :class:`fileinput.FileInput` have been deprecated. diff --git a/Misc/NEWS.d/3.11.0a1.rst b/Misc/NEWS.d/3.11.0a1.rst index 46c65300abf2c2..38f1e15f5b1087 100644 --- a/Misc/NEWS.d/3.11.0a1.rst +++ b/Misc/NEWS.d/3.11.0a1.rst @@ -1720,7 +1720,7 @@ Improve the speed and accuracy of statistics.pvariance(). .. nonce: WI9zQY .. section: Library -Remove :meth:`__getitem__` methods of +Remove :meth:`~object.__getitem__` methods of :class:`xml.dom.pulldom.DOMEventStream`, :class:`wsgiref.util.FileWrapper` and :class:`fileinput.FileInput`, deprecated since Python 3.9. diff --git a/Misc/NEWS.d/3.8.0a1.rst b/Misc/NEWS.d/3.8.0a1.rst index 4adacfd41809d5..104e6649f24120 100644 --- a/Misc/NEWS.d/3.8.0a1.rst +++ b/Misc/NEWS.d/3.8.0a1.rst @@ -3592,7 +3592,7 @@ Python 3.5. .. nonce: V8Ou3K .. section: Library -Deprecate :meth:`__getitem__` methods of +Deprecate :meth:`~object.__getitem__` methods of :class:`xml.dom.pulldom.DOMEventStream`, :class:`wsgiref.util.FileWrapper` and :class:`fileinput.FileInput`. From f22cdecb1950d87154f15b9a67819ecc472ff510 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 19 Oct 2023 17:34:27 +0200 Subject: [PATCH 104/265] [3.11] GH-101100: Fix reference warnings for ``__enter__`` and ``__exit__`` (GH-110112) (#111076) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> --- Doc/glossary.rst | 2 +- Doc/library/contextlib.rst | 22 +++++++++++----------- Doc/library/stdtypes.rst | 2 +- Doc/library/test.rst | 2 +- Doc/library/unittest.mock.rst | 4 ++-- Doc/reference/compound_stmts.rst | 20 ++++++++++---------- Doc/reference/datamodel.rst | 6 +++--- Doc/whatsnew/2.5.rst | 22 +++++++++++----------- Doc/whatsnew/2.6.rst | 26 +++++++++++++------------- Doc/whatsnew/2.7.rst | 8 ++++---- Misc/NEWS.d/3.10.0a7.rst | 2 +- 11 files changed, 58 insertions(+), 58 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst index 7c91a0a8184e6b..f318a37b654e9d 100644 --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -248,7 +248,7 @@ Glossary context manager An object which controls the environment seen in a :keyword:`with` - statement by defining :meth:`__enter__` and :meth:`__exit__` methods. + statement by defining :meth:`~object.__enter__` and :meth:`~object.__exit__` methods. See :pep:`343`. context variable diff --git a/Doc/library/contextlib.rst b/Doc/library/contextlib.rst index 1b55868c3aa62f..80a683d4756fe8 100644 --- a/Doc/library/contextlib.rst +++ b/Doc/library/contextlib.rst @@ -45,7 +45,7 @@ Functions and classes provided: This function is a :term:`decorator` that can be used to define a factory function for :keyword:`with` statement context managers, without needing to - create a class or separate :meth:`__enter__` and :meth:`__exit__` methods. + create a class or separate :meth:`~object.__enter__` and :meth:`~object.__exit__` methods. 
While many objects natively support use in with statements, sometimes a resource needs to be managed that isn't a context manager in its own right, @@ -508,7 +508,7 @@ Functions and classes provided: # the with statement, even if attempts to open files later # in the list raise an exception - The :meth:`__enter__` method returns the :class:`ExitStack` instance, and + The :meth:`~object.__enter__` method returns the :class:`ExitStack` instance, and performs no additional operations. Each instance maintains a stack of registered callbacks that are called in @@ -536,9 +536,9 @@ Functions and classes provided: .. method:: enter_context(cm) - Enters a new context manager and adds its :meth:`__exit__` method to + Enters a new context manager and adds its :meth:`~object.__exit__` method to the callback stack. The return value is the result of the context - manager's own :meth:`__enter__` method. + manager's own :meth:`~object.__enter__` method. These context managers may suppress exceptions just as they normally would if used directly as part of a :keyword:`with` statement. @@ -549,18 +549,18 @@ Functions and classes provided: .. method:: push(exit) - Adds a context manager's :meth:`__exit__` method to the callback stack. + Adds a context manager's :meth:`~object.__exit__` method to the callback stack. As ``__enter__`` is *not* invoked, this method can be used to cover - part of an :meth:`__enter__` implementation with a context manager's own - :meth:`__exit__` method. + part of an :meth:`~object.__enter__` implementation with a context manager's own + :meth:`~object.__exit__` method. If passed an object that is not a context manager, this method assumes it is a callback with the same signature as a context manager's - :meth:`__exit__` method and adds it directly to the callback stack. + :meth:`~object.__exit__` method and adds it directly to the callback stack. By returning true values, these callbacks can suppress exceptions the - same way context manager :meth:`__exit__` methods can. + same way context manager :meth:`~object.__exit__` methods can. The passed in object is returned from the function, allowing this method to be used as a function decorator. @@ -707,7 +707,7 @@ Cleaning up in an ``__enter__`` implementation As noted in the documentation of :meth:`ExitStack.push`, this method can be useful in cleaning up an already allocated resource if later -steps in the :meth:`__enter__` implementation fail. +steps in the :meth:`~object.__enter__` implementation fail. Here's an example of doing this for a context manager that accepts resource acquisition and release functions, along with an optional validation function, @@ -864,7 +864,7 @@ And also as a function decorator:: Note that there is one additional limitation when using context managers as function decorators: there's no way to access the return value of -:meth:`__enter__`. If that value is needed, then it is still necessary to use +:meth:`~object.__enter__`. If that value is needed, then it is still necessary to use an explicit ``with`` statement. .. seealso:: diff --git a/Doc/library/stdtypes.rst b/Doc/library/stdtypes.rst index ad3135cbc146ed..fd3218a89c7cd2 100644 --- a/Doc/library/stdtypes.rst +++ b/Doc/library/stdtypes.rst @@ -4802,7 +4802,7 @@ before the statement body is executed and exited when the statement ends: The exception passed in should never be reraised explicitly - instead, this method should return a false value to indicate that the method completed successfully and does not want to suppress the raised exception. 
This allows - context management code to easily detect whether or not an :meth:`__exit__` + context management code to easily detect whether or not an :meth:`~object.__exit__` method has actually failed. Python defines several context managers to support easy thread synchronisation, diff --git a/Doc/library/test.rst b/Doc/library/test.rst index d3eb0ae00a08dc..aa67a9480ddfcb 100644 --- a/Doc/library/test.rst +++ b/Doc/library/test.rst @@ -1021,7 +1021,7 @@ The :mod:`test.support` module defines the following classes: :const:`resource.RLIMIT_CORE`'s soft limit to 0 to prevent coredump file creation. - On both platforms, the old value is restored by :meth:`__exit__`. + On both platforms, the old value is restored by :meth:`~object.__exit__`. .. class:: SaveSignals() diff --git a/Doc/library/unittest.mock.rst b/Doc/library/unittest.mock.rst index dee5c9e88a0a44..b7e03dfd642b39 100644 --- a/Doc/library/unittest.mock.rst +++ b/Doc/library/unittest.mock.rst @@ -2480,8 +2480,8 @@ are closed properly and is becoming common:: f.write('something') The issue is that even if you mock out the call to :func:`open` it is the -*returned object* that is used as a context manager (and has :meth:`__enter__` and -:meth:`__exit__` called). +*returned object* that is used as a context manager (and has :meth:`~object.__enter__` and +:meth:`~object.__exit__` called). Mocking context managers with a :class:`MagicMock` is common enough and fiddly enough that a helper function is useful. :: diff --git a/Doc/reference/compound_stmts.rst b/Doc/reference/compound_stmts.rst index 4efe30dbd25b83..fd3091105d2273 100644 --- a/Doc/reference/compound_stmts.rst +++ b/Doc/reference/compound_stmts.rst @@ -491,37 +491,37 @@ The execution of the :keyword:`with` statement with one "item" proceeds as follo #. The context expression (the expression given in the :token:`~python-grammar:with_item`) is evaluated to obtain a context manager. -#. The context manager's :meth:`__enter__` is loaded for later use. +#. The context manager's :meth:`~object.__enter__` is loaded for later use. -#. The context manager's :meth:`__exit__` is loaded for later use. +#. The context manager's :meth:`~object.__exit__` is loaded for later use. -#. The context manager's :meth:`__enter__` method is invoked. +#. The context manager's :meth:`~object.__enter__` method is invoked. #. If a target was included in the :keyword:`with` statement, the return value - from :meth:`__enter__` is assigned to it. + from :meth:`~object.__enter__` is assigned to it. .. note:: - The :keyword:`with` statement guarantees that if the :meth:`__enter__` - method returns without an error, then :meth:`__exit__` will always be + The :keyword:`with` statement guarantees that if the :meth:`~object.__enter__` + method returns without an error, then :meth:`~object.__exit__` will always be called. Thus, if an error occurs during the assignment to the target list, it will be treated the same as an error occurring within the suite would be. See step 7 below. #. The suite is executed. -#. The context manager's :meth:`__exit__` method is invoked. If an exception +#. The context manager's :meth:`~object.__exit__` method is invoked. If an exception caused the suite to be exited, its type, value, and traceback are passed as - arguments to :meth:`__exit__`. Otherwise, three :const:`None` arguments are + arguments to :meth:`~object.__exit__`. Otherwise, three :const:`None` arguments are supplied. 
If the suite was exited due to an exception, and the return value from the - :meth:`__exit__` method was false, the exception is reraised. If the return + :meth:`~object.__exit__` method was false, the exception is reraised. If the return value was true, the exception is suppressed, and execution continues with the statement following the :keyword:`with` statement. If the suite was exited for any reason other than an exception, the return - value from :meth:`__exit__` is ignored, and execution proceeds at the normal + value from :meth:`~object.__exit__` is ignored, and execution proceeds at the normal location for the kind of exit that was taken. The following code:: diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst index 27ac00216a01df..ce92a120609046 100644 --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -2920,7 +2920,7 @@ For more information on context managers, see :ref:`typecontextmanager`. (i.e., prevent it from being propagated), it should return a true value. Otherwise, the exception will be processed normally upon exit from this method. - Note that :meth:`__exit__` methods should not reraise the passed-in exception; + Note that :meth:`~object.__exit__` methods should not reraise the passed-in exception; this is the caller's responsibility. @@ -3192,12 +3192,12 @@ Asynchronous context managers can be used in an :keyword:`async with` statement. .. method:: object.__aenter__(self) - Semantically similar to :meth:`__enter__`, the only + Semantically similar to :meth:`~object.__enter__`, the only difference being that it must return an *awaitable*. .. method:: object.__aexit__(self, exc_type, exc_value, traceback) - Semantically similar to :meth:`__exit__`, the only + Semantically similar to :meth:`~object.__exit__`, the only difference being that it must return an *awaitable*. An example of an asynchronous context manager class:: diff --git a/Doc/whatsnew/2.5.rst b/Doc/whatsnew/2.5.rst index f58b3ede27facc..ad0931ecbed060 100644 --- a/Doc/whatsnew/2.5.rst +++ b/Doc/whatsnew/2.5.rst @@ -575,15 +575,15 @@ structure is:: with-block The expression is evaluated, and it should result in an object that supports the -context management protocol (that is, has :meth:`__enter__` and :meth:`__exit__` +context management protocol (that is, has :meth:`~object.__enter__` and :meth:`~object.__exit__` methods. -The object's :meth:`__enter__` is called before *with-block* is executed and +The object's :meth:`~object.__enter__` is called before *with-block* is executed and therefore can run set-up code. It also may return a value that is bound to the name *variable*, if given. (Note carefully that *variable* is *not* assigned the result of *expression*.) -After execution of the *with-block* is finished, the object's :meth:`__exit__` +After execution of the *with-block* is finished, the object's :meth:`~object.__exit__` method is called, even if the block raised an exception, and can therefore run clean-up code. @@ -609,7 +609,7 @@ part-way through the block. .. note:: In this case, *f* is the same object created by :func:`open`, because - :meth:`file.__enter__` returns *self*. + :meth:`~object.__enter__` returns *self*. The :mod:`threading` module's locks and condition variables also support the ':keyword:`with`' statement:: @@ -652,10 +652,10 @@ underlying implementation and should keep reading. A high-level explanation of the context management protocol is: * The expression is evaluated and should result in an object called a "context - manager". 
The context manager must have :meth:`__enter__` and :meth:`__exit__` + manager". The context manager must have :meth:`~object.__enter__` and :meth:`~object.__exit__` methods. -* The context manager's :meth:`__enter__` method is called. The value returned +* The context manager's :meth:`~object.__enter__` method is called. The value returned is assigned to *VAR*. If no ``'as VAR'`` clause is present, the value is simply discarded. @@ -669,7 +669,7 @@ A high-level explanation of the context management protocol is: if you do the author of the code containing the ':keyword:`with`' statement will never realize anything went wrong. -* If *BLOCK* didn't raise an exception, the :meth:`__exit__` method is still +* If *BLOCK* didn't raise an exception, the :meth:`~object.__exit__` method is still called, but *type*, *value*, and *traceback* are all ``None``. Let's think through an example. I won't present detailed code but will only @@ -703,7 +703,7 @@ rolled back if there's an exception. Here's the basic interface for def rollback (self): "Rolls back current transaction" -The :meth:`__enter__` method is pretty easy, having only to start a new +The :meth:`~object.__enter__` method is pretty easy, having only to start a new transaction. For this application the resulting cursor object would be a useful result, so the method will return it. The user can then add ``as cursor`` to their ':keyword:`with`' statement to bind the cursor to a variable name. :: @@ -715,7 +715,7 @@ their ':keyword:`with`' statement to bind the cursor to a variable name. :: cursor = self.cursor() return cursor -The :meth:`__exit__` method is the most complicated because it's where most of +The :meth:`~object.__exit__` method is the most complicated because it's where most of the work has to be done. The method has to check if an exception occurred. If there was no exception, the transaction is committed. The transaction is rolled back if there was an exception. @@ -748,10 +748,10 @@ are useful for writing objects for use with the ':keyword:`with`' statement. The decorator is called :func:`contextmanager`, and lets you write a single generator function instead of defining a new class. The generator should yield exactly one value. The code up to the :keyword:`yield` will be executed as the -:meth:`__enter__` method, and the value yielded will be the method's return +:meth:`~object.__enter__` method, and the value yielded will be the method's return value that will get bound to the variable in the ':keyword:`with`' statement's :keyword:`!as` clause, if any. The code after the :keyword:`yield` will be -executed in the :meth:`__exit__` method. Any exception raised in the block will +executed in the :meth:`~object.__exit__` method. Any exception raised in the block will be raised by the :keyword:`!yield` statement. Our database example from the previous section could be written using this diff --git a/Doc/whatsnew/2.6.rst b/Doc/whatsnew/2.6.rst index 759a0757ebefa7..2b7ef2cd4fe4a9 100644 --- a/Doc/whatsnew/2.6.rst +++ b/Doc/whatsnew/2.6.rst @@ -269,15 +269,15 @@ structure is:: with-block The expression is evaluated, and it should result in an object that supports the -context management protocol (that is, has :meth:`__enter__` and :meth:`__exit__` +context management protocol (that is, has :meth:`~object.__enter__` and :meth:`~object.__exit__` methods). 
-The object's :meth:`__enter__` is called before *with-block* is executed and +The object's :meth:`~object.__enter__` is called before *with-block* is executed and therefore can run set-up code. It also may return a value that is bound to the name *variable*, if given. (Note carefully that *variable* is *not* assigned the result of *expression*.) -After execution of the *with-block* is finished, the object's :meth:`__exit__` +After execution of the *with-block* is finished, the object's :meth:`~object.__exit__` method is called, even if the block raised an exception, and can therefore run clean-up code. @@ -296,7 +296,7 @@ part-way through the block. .. note:: In this case, *f* is the same object created by :func:`open`, because - :meth:`file.__enter__` returns *self*. + :meth:`~object.__enter__` returns *self*. The :mod:`threading` module's locks and condition variables also support the ':keyword:`with`' statement:: @@ -339,16 +339,16 @@ underlying implementation and should keep reading. A high-level explanation of the context management protocol is: * The expression is evaluated and should result in an object called a "context - manager". The context manager must have :meth:`__enter__` and :meth:`__exit__` + manager". The context manager must have :meth:`~object.__enter__` and :meth:`~object.__exit__` methods. -* The context manager's :meth:`__enter__` method is called. The value returned +* The context manager's :meth:`~object.__enter__` method is called. The value returned is assigned to *VAR*. If no ``as VAR`` clause is present, the value is simply discarded. * The code in *BLOCK* is executed. -* If *BLOCK* raises an exception, the context manager's :meth:`__exit__` method +* If *BLOCK* raises an exception, the context manager's :meth:`~object.__exit__` method is called with three arguments, the exception details (``type, value, traceback``, the same values returned by :func:`sys.exc_info`, which can also be ``None`` if no exception occurred). The method's return value controls whether an exception @@ -357,7 +357,7 @@ A high-level explanation of the context management protocol is: if you do the author of the code containing the ':keyword:`with`' statement will never realize anything went wrong. -* If *BLOCK* didn't raise an exception, the :meth:`__exit__` method is still +* If *BLOCK* didn't raise an exception, the :meth:`~object.__exit__` method is still called, but *type*, *value*, and *traceback* are all ``None``. Let's think through an example. I won't present detailed code but will only @@ -391,7 +391,7 @@ rolled back if there's an exception. Here's the basic interface for def rollback(self): "Rolls back current transaction" -The :meth:`__enter__` method is pretty easy, having only to start a new +The :meth:`~object.__enter__` method is pretty easy, having only to start a new transaction. For this application the resulting cursor object would be a useful result, so the method will return it. The user can then add ``as cursor`` to their ':keyword:`with`' statement to bind the cursor to a variable name. :: @@ -403,7 +403,7 @@ their ':keyword:`with`' statement to bind the cursor to a variable name. :: cursor = self.cursor() return cursor -The :meth:`__exit__` method is the most complicated because it's where most of +The :meth:`~object.__exit__` method is the most complicated because it's where most of the work has to be done. The method has to check if an exception occurred. If there was no exception, the transaction is committed. 
The transaction is rolled back if there was an exception. @@ -436,10 +436,10 @@ are useful when writing objects for use with the ':keyword:`with`' statement. The decorator is called :func:`contextmanager`, and lets you write a single generator function instead of defining a new class. The generator should yield exactly one value. The code up to the :keyword:`yield` will be executed as the -:meth:`__enter__` method, and the value yielded will be the method's return +:meth:`~object.__enter__` method, and the value yielded will be the method's return value that will get bound to the variable in the ':keyword:`with`' statement's :keyword:`!as` clause, if any. The code after the :keyword:`!yield` will be -executed in the :meth:`__exit__` method. Any exception raised in the block will +executed in the :meth:`~object.__exit__` method. Any exception raised in the block will be raised by the :keyword:`!yield` statement. Using this decorator, our database example from the previous section @@ -1737,7 +1737,7 @@ Optimizations (Contributed by Antoine Pitrou.) Memory usage is reduced by using pymalloc for the Unicode string's data. -* The ``with`` statement now stores the :meth:`__exit__` method on the stack, +* The ``with`` statement now stores the :meth:`~object.__exit__` method on the stack, producing a small speedup. (Implemented by Jeffrey Yasskin.) * To reduce memory usage, the garbage collector will now clear internal diff --git a/Doc/whatsnew/2.7.rst b/Doc/whatsnew/2.7.rst index caf28ff50b3aa2..5979d77da5d6b6 100644 --- a/Doc/whatsnew/2.7.rst +++ b/Doc/whatsnew/2.7.rst @@ -930,8 +930,8 @@ Optimizations Several performance enhancements have been added: * A new opcode was added to perform the initial setup for - :keyword:`with` statements, looking up the :meth:`__enter__` and - :meth:`__exit__` methods. (Contributed by Benjamin Peterson.) + :keyword:`with` statements, looking up the :meth:`~object.__enter__` and + :meth:`~object.__exit__` methods. (Contributed by Benjamin Peterson.) * The garbage collector now performs better for one common usage pattern: when many objects are being allocated without deallocating @@ -2448,13 +2448,13 @@ that may require changes to your code: (Changed by Eric Smith; :issue:`5920`.) * Because of an optimization for the :keyword:`with` statement, the special - methods :meth:`__enter__` and :meth:`__exit__` must belong to the object's + methods :meth:`~object.__enter__` and :meth:`~object.__exit__` must belong to the object's type, and cannot be directly attached to the object's instance. This affects new-style classes (derived from :class:`object`) and C extension types. (:issue:`6101`.) * Due to a bug in Python 2.6, the *exc_value* parameter to - :meth:`__exit__` methods was often the string representation of the + :meth:`~object.__exit__` methods was often the string representation of the exception, not an instance. This was fixed in 2.7, so *exc_value* will be an instance as expected. (Fixed by Florent Xicluna; :issue:`7853`.) diff --git a/Misc/NEWS.d/3.10.0a7.rst b/Misc/NEWS.d/3.10.0a7.rst index 3a1694f444616a..d9cdfbd04c88d4 100644 --- a/Misc/NEWS.d/3.10.0a7.rst +++ b/Misc/NEWS.d/3.10.0a7.rst @@ -274,7 +274,7 @@ Co-authored-by: Tim Peters Only handle asynchronous exceptions and requests to drop the GIL when returning from a call or on the back edges of loops. Makes sure that -:meth:`__exit__` is always called in with statements, even for interrupts. +:meth:`~object.__exit__` is always called in with statements, even for interrupts. .. 
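For readers following the ``__enter__``/``__exit__`` cross-reference fixes in the patch above, the protocol those references point to can be exercised with a minimal, illustrative context manager. The sketch below is not part of any patch in this series; the class name and printed strings are invented for illustration, but the method signatures follow the steps described in the compound_stmts.rst hunk above (enter before the suite runs, exit with either the exception details or three None values, and a true return value from __exit__ suppresses the exception).

    class Managed:
        """Minimal context manager following the documented protocol."""

        def __enter__(self):
            # Runs before the with-block; the return value is what an
            # ``as`` clause binds.
            print("enter")
            return self

        def __exit__(self, exc_type, exc_value, traceback):
            # Runs when the with-block exits.  On success all three
            # arguments are None; otherwise they hold the exception
            # details.  Returning a true value would suppress the
            # exception; returning False lets it propagate.
            print("exit:", exc_type)
            return False

    with Managed() as m:
        print("inside the block with", m)
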
From 2b3f9a5a1dc6a4e335bdcd4454b298007f7fad4a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 19 Oct 2023 17:35:07 +0200 Subject: [PATCH 105/265] [3.11] gh-109510: Clearly explain "Which Docstrings Are Examined" (GH-109696) (#111078) Co-authored-by: Unique-Usman <86585626+Unique-Usman@users.noreply.github.com> Co-authored-by: Mariatta Co-authored-by: Jacob Coffee Co-authored-by: Hugo van Kemenade Co-authored-by: C.A.M. Gerlach --- Doc/library/doctest.rst | 27 ++++++++++++++++++++++++--- 1 file changed, 24 insertions(+), 3 deletions(-) diff --git a/Doc/library/doctest.rst b/Doc/library/doctest.rst index c106d5a3383a5e..d00c4c88d57176 100644 --- a/Doc/library/doctest.rst +++ b/Doc/library/doctest.rst @@ -277,13 +277,34 @@ Which Docstrings Are Examined? The module docstring, and all function, class and method docstrings are searched. Objects imported into the module are not searched. -In addition, if ``M.__test__`` exists and "is true", it must be a dict, and each +In addition, there are cases when you want tests to be part of a module but not part +of the help text, which requires that the tests not be included in the docstring. +Doctest looks for a module-level variable called ``__test__`` and uses it to locate other +tests. If ``M.__test__`` exists and is truthy, it must be a dict, and each entry maps a (string) name to a function object, class object, or string. Function and class object docstrings found from ``M.__test__`` are searched, and strings are treated as if they were docstrings. In output, a key ``K`` in -``M.__test__`` appears with name :: +``M.__test__`` appears with name ``M.__test__.K``. - .__test__.K +For example, place this block of code at the top of :file:`example.py`: + +.. code-block:: python + + __test__ = { + 'numbers': """ + >>> factorial(6) + 720 + + >>> [factorial(n) for n in range(6)] + [1, 1, 2, 6, 24, 120] + """ + } + +The value of ``example.__test__["numbers"]`` will be treated as a +docstring and all the tests inside it will be run. It is +important to note that the value can be mapped to a function, +class object, or module; if so, :mod:`!doctest` +searches them recursively for docstrings, which are then scanned for tests. Any classes found are recursively searched similarly, to test docstrings in their contained methods and nested classes. From 69f0c900114d32023d663b7448417e7a252e80eb Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 20 Oct 2023 06:25:17 +0200 Subject: [PATCH 106/265] [3.11] gh-111092: Make turtledemo run without default root enabled (GH-111093) (#111096) gh-111092: Make turtledemo run without default root enabled (GH-111093) Add missing 'root' argument to PanedWindow call. Other root children already have it. 
(cherry picked from commit b802882fb2bff8b431df661322908c07491f3ce7) Co-authored-by: Terry Jan Reedy --- Lib/turtledemo/__main__.py | 2 +- .../next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst | 1 + 2 files changed, 2 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst diff --git a/Lib/turtledemo/__main__.py b/Lib/turtledemo/__main__.py index f6c9d6aa6f9a32..2ab6c15e2c079e 100755 --- a/Lib/turtledemo/__main__.py +++ b/Lib/turtledemo/__main__.py @@ -161,7 +161,7 @@ def __init__(self, filename=None): label='Help', underline=0) root['menu'] = self.mBar - pane = PanedWindow(orient=HORIZONTAL, sashwidth=5, + pane = PanedWindow(root, orient=HORIZONTAL, sashwidth=5, sashrelief=SOLID, bg='#ddd') pane.add(self.makeTextFrame(pane)) pane.add(self.makeGraphFrame(pane)) diff --git a/Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst b/Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst new file mode 100644 index 00000000000000..487bd177d27e31 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst @@ -0,0 +1 @@ +Make turtledemo run without default root enabled. From f1dbde0d3a448e623bd8c4c76139284174e0f90d Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 20 Oct 2023 06:50:17 +0200 Subject: [PATCH 107/265] [3.11] gh-101100: Fix Sphinx warnings in `library/tty.rst` (GH-111079) (#111098) gh-101100: Fix Sphinx warnings in `library/tty.rst` (GH-111079) Fix Sphinx warnings in library/tty.rst (cherry picked from commit c42c68aa7bd19b0de7f2132ed468bc4ce83d8aa9) Co-authored-by: Hugo van Kemenade --- Doc/library/termios.rst | 18 ++++++++++++++---- Doc/tools/.nitignore | 1 - 2 files changed, 14 insertions(+), 5 deletions(-) diff --git a/Doc/library/termios.rst b/Doc/library/termios.rst index fb1ff567d49e5c..03806178e9d3fb 100644 --- a/Doc/library/termios.rst +++ b/Doc/library/termios.rst @@ -43,10 +43,20 @@ The module defines the following functions: Set the tty attributes for file descriptor *fd* from the *attributes*, which is a list like the one returned by :func:`tcgetattr`. The *when* argument - determines when the attributes are changed: :const:`TCSANOW` to change - immediately, :const:`TCSADRAIN` to change after transmitting all queued output, - or :const:`TCSAFLUSH` to change after transmitting all queued output and - discarding all queued input. + determines when the attributes are changed: + + .. data:: TCSANOW + + Change attributes immediately. + + .. data:: TCSADRAIN + + Change attributes after transmitting all queued output. + + .. data:: TCSAFLUSH + + Change attributes after transmitting all queued output and + discarding all queued input. .. 
function:: tcsendbreak(fd, duration) diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index b47924f25348fa..b49fce00cf4886 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -124,7 +124,6 @@ Doc/library/tkinter.rst Doc/library/tkinter.scrolledtext.rst Doc/library/tkinter.ttk.rst Doc/library/traceback.rst -Doc/library/tty.rst Doc/library/unittest.mock.rst Doc/library/unittest.rst Doc/library/urllib.parse.rst From 6df935c2128f59c327fe29ff8c114a2bc644471a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 20 Oct 2023 10:09:23 +0200 Subject: [PATCH 108/265] [3.11] gh-101100: Fix sphinx warnings in `library/codecs.rst` (GH-110979) (#111071) Co-authored-by: Nikita Sobolev Co-authored-by: Hugo van Kemenade --- Doc/library/codecs.rst | 71 ++++++++++++++++++++++-------------------- Doc/tools/.nitignore | 1 - 2 files changed, 38 insertions(+), 34 deletions(-) diff --git a/Doc/library/codecs.rst b/Doc/library/codecs.rst index 2db4a67d1973d5..9ce584874783da 100644 --- a/Doc/library/codecs.rst +++ b/Doc/library/codecs.rst @@ -520,44 +520,46 @@ The base :class:`Codec` class defines these methods which also define the function interfaces of the stateless encoder and decoder: -.. method:: Codec.encode(input, errors='strict') +.. class:: Codec - Encodes the object *input* and returns a tuple (output object, length consumed). - For instance, :term:`text encoding` converts - a string object to a bytes object using a particular - character set encoding (e.g., ``cp1252`` or ``iso-8859-1``). + .. method:: encode(input, errors='strict') - The *errors* argument defines the error handling to apply. - It defaults to ``'strict'`` handling. + Encodes the object *input* and returns a tuple (output object, length consumed). + For instance, :term:`text encoding` converts + a string object to a bytes object using a particular + character set encoding (e.g., ``cp1252`` or ``iso-8859-1``). - The method may not store state in the :class:`Codec` instance. Use - :class:`StreamWriter` for codecs which have to keep state in order to make - encoding efficient. + The *errors* argument defines the error handling to apply. + It defaults to ``'strict'`` handling. - The encoder must be able to handle zero length input and return an empty object - of the output object type in this situation. + The method may not store state in the :class:`Codec` instance. Use + :class:`StreamWriter` for codecs which have to keep state in order to make + encoding efficient. + The encoder must be able to handle zero length input and return an empty object + of the output object type in this situation. -.. method:: Codec.decode(input, errors='strict') - Decodes the object *input* and returns a tuple (output object, length - consumed). For instance, for a :term:`text encoding`, decoding converts - a bytes object encoded using a particular - character set encoding to a string object. + .. method:: decode(input, errors='strict') - For text encodings and bytes-to-bytes codecs, - *input* must be a bytes object or one which provides the read-only - buffer interface -- for example, buffer objects and memory mapped files. + Decodes the object *input* and returns a tuple (output object, length + consumed). For instance, for a :term:`text encoding`, decoding converts + a bytes object encoded using a particular + character set encoding to a string object. - The *errors* argument defines the error handling to apply. - It defaults to ``'strict'`` handling. 
+ For text encodings and bytes-to-bytes codecs, + *input* must be a bytes object or one which provides the read-only + buffer interface -- for example, buffer objects and memory mapped files. - The method may not store state in the :class:`Codec` instance. Use - :class:`StreamReader` for codecs which have to keep state in order to make - decoding efficient. + The *errors* argument defines the error handling to apply. + It defaults to ``'strict'`` handling. - The decoder must be able to handle zero length input and return an empty object - of the output object type in this situation. + The method may not store state in the :class:`Codec` instance. Use + :class:`StreamReader` for codecs which have to keep state in order to make + decoding efficient. + + The decoder must be able to handle zero length input and return an empty object + of the output object type in this situation. Incremental Encoding and Decoding @@ -705,7 +707,7 @@ Stream Encoding and Decoding The :class:`StreamWriter` and :class:`StreamReader` classes provide generic working interfaces which can be used to implement new encoding submodules very -easily. See :mod:`encodings.utf_8` for an example of how this is done. +easily. See :mod:`!encodings.utf_8` for an example of how this is done. .. _stream-writer-objects: @@ -895,9 +897,10 @@ The design is such that one can use the factory functions returned by the .. class:: StreamRecoder(stream, encode, decode, Reader, Writer, errors='strict') Creates a :class:`StreamRecoder` instance which implements a two-way conversion: - *encode* and *decode* work on the frontend — the data visible to - code calling :meth:`read` and :meth:`write`, while *Reader* and *Writer* - work on the backend — the data in *stream*. + *encode* and *decode* work on the frontend — the data visible to + code calling :meth:`~StreamReader.read` and :meth:`~StreamWriter.write`, + while *Reader* and *Writer* + work on the backend — the data in *stream*. You can use these objects to do transparent transcodings, e.g., from Latin-1 to UTF-8 and back. @@ -1417,8 +1420,10 @@ to :class:`bytes` mappings. They are not supported by :meth:`bytes.decode` | | quotedprintable, | quoted printable. | ``quotetabs=True`` / | | | quoted_printable | | :meth:`quopri.decode` | +----------------------+------------------+------------------------------+------------------------------+ -| uu_codec | uu | Convert the operand using | :meth:`uu.encode` / | -| | | uuencode. | :meth:`uu.decode` | +| uu_codec | uu | Convert the operand using | :meth:`!uu.encode` / | +| | | uuencode. | :meth:`!uu.decode` | +| | | | (Note: :mod:`uu` is | +| | | | deprecated.) | +----------------------+------------------+------------------------------+------------------------------+ | zlib_codec | zip, zlib | Compress the operand using | :meth:`zlib.compress` / | | | | gzip. 
| :meth:`zlib.decompress` | diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index b49fce00cf4886..d42292aed04ea4 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -43,7 +43,6 @@ Doc/library/bisect.rst Doc/library/bz2.rst Doc/library/calendar.rst Doc/library/cmd.rst -Doc/library/codecs.rst Doc/library/collections.abc.rst Doc/library/collections.rst Doc/library/concurrent.futures.rst From 69bcaf7e0e68f5bdead833a5aa47f67f256c6440 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 20 Oct 2023 14:19:04 +0200 Subject: [PATCH 109/265] gh-110913: Fix WindowsConsoleIO chunking of UTF-8 text (GH-111007) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit (cherry picked from commit 11312eae6ec3acf51aacafce4cb6d1a5edfd5f2e) Co-authored-by: Tamás Hegedűs --- ...-10-19-21-46-18.gh-issue-110913.CWlPfg.rst | 1 + Modules/_io/winconsoleio.c | 36 ++++++++++--------- 2 files changed, 21 insertions(+), 16 deletions(-) create mode 100644 Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst diff --git a/Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst b/Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst new file mode 100644 index 00000000000000..d4c1b56d98ef0e --- /dev/null +++ b/Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst @@ -0,0 +1 @@ +WindowsConsoleIO now correctly chunks large buffers without splitting up UTF-8 sequences. diff --git a/Modules/_io/winconsoleio.c b/Modules/_io/winconsoleio.c index 89431b1e4c3c7b..dcb7d32e78c26f 100644 --- a/Modules/_io/winconsoleio.c +++ b/Modules/_io/winconsoleio.c @@ -132,6 +132,23 @@ char _PyIO_get_console_type(PyObject *path_or_fd) { return m; } +static DWORD +_find_last_utf8_boundary(const char *buf, DWORD len) +{ + /* This function never returns 0, returns the original len instead */ + DWORD count = 1; + if (len == 0 || (buf[len - 1] & 0x80) == 0) { + return len; + } + for (;; count++) { + if (count > 3 || count >= len) { + return len; + } + if ((buf[len - count] & 0xc0) != 0x80) { + return len - count; + } + } +} /*[clinic input] module _io @@ -954,7 +971,7 @@ _io__WindowsConsoleIO_write_impl(winconsoleio *self, Py_buffer *b) { BOOL res = TRUE; wchar_t *wbuf; - DWORD len, wlen, orig_len, n = 0; + DWORD len, wlen, n = 0; HANDLE handle; if (self->fd == -1) @@ -984,21 +1001,8 @@ _io__WindowsConsoleIO_write_impl(winconsoleio *self, Py_buffer *b) have to reduce and recalculate. */ while (wlen > 32766 / sizeof(wchar_t)) { len /= 2; - orig_len = len; - /* Reduce the length until we hit the final byte of a UTF-8 sequence - * (top bit is unset). Fix for github issue 82052. - */ - while (len > 0 && (((char *)b->buf)[len-1] & 0x80) != 0) - --len; - /* If we hit a length of 0, something has gone wrong. This shouldn't - * be possible, as valid UTF-8 can have at most 3 non-final bytes - * before a final one, and our buffer is way longer than that. - * But to be on the safe side, if we hit this issue we just restore - * the original length and let the console API sort it out. - */ - if (len == 0) { - len = orig_len; - } + /* Fix for github issues gh-110913 and gh-82052. 
*/ + len = _find_last_utf8_boundary(b->buf, len); wlen = MultiByteToWideChar(CP_UTF8, 0, b->buf, len, NULL, 0); } Py_END_ALLOW_THREADS From 7213fc248db1c2ae4ee122a3fde69105171428f2 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 20 Oct 2023 16:33:22 +0200 Subject: [PATCH 110/265] [3.11] Synchronize test_contextlib with test_contextlib_async (GH-111000) (GH-111115) (cherry picked from commit ff4e53cb747063e95eaec181fd396f062f885ac2) Co-authored-by: Serhiy Storchaka --- Lib/test/test_contextlib.py | 46 +++++++++++++++++++++++++++++++++++++ 1 file changed, 46 insertions(+) diff --git a/Lib/test/test_contextlib.py b/Lib/test/test_contextlib.py index 093f2593a4a464..0a79b3617defe8 100644 --- a/Lib/test/test_contextlib.py +++ b/Lib/test/test_contextlib.py @@ -162,6 +162,15 @@ def whoo(): # The "gen" attribute is an implementation detail. self.assertFalse(ctx.gen.gi_suspended) + def test_contextmanager_trap_no_yield(self): + @contextmanager + def whoo(): + if False: + yield + ctx = whoo() + with self.assertRaises(RuntimeError): + ctx.__enter__() + def test_contextmanager_trap_second_yield(self): @contextmanager def whoo(): @@ -175,6 +184,19 @@ def whoo(): # The "gen" attribute is an implementation detail. self.assertFalse(ctx.gen.gi_suspended) + def test_contextmanager_non_normalised(self): + @contextmanager + def whoo(): + try: + yield + except RuntimeError: + raise SyntaxError + + ctx = whoo() + ctx.__enter__() + with self.assertRaises(SyntaxError): + ctx.__exit__(RuntimeError, None, None) + def test_contextmanager_except(self): state = [] @contextmanager @@ -254,6 +276,25 @@ def test_issue29692(): self.assertEqual(ex.args[0], 'issue29692:Unchained') self.assertIsNone(ex.__cause__) + def test_contextmanager_wrap_runtimeerror(self): + @contextmanager + def woohoo(): + try: + yield + except Exception as exc: + raise RuntimeError(f'caught {exc}') from exc + + with self.assertRaises(RuntimeError): + with woohoo(): + 1 / 0 + + # If the context manager wrapped StopIteration in a RuntimeError, + # we also unwrap it, because we can't tell whether the wrapping was + # done by the generator machinery or by the generator itself. 
+ with self.assertRaises(StopIteration): + with woohoo(): + raise StopIteration + def _create_contextmanager_attribs(self): def attribs(**kw): def decorate(func): @@ -265,6 +306,7 @@ def decorate(func): @attribs(foo='bar') def baz(spam): """Whee!""" + yield return baz def test_contextmanager_attribs(self): @@ -321,8 +363,11 @@ def woohoo(a, *, b): def test_recursive(self): depth = 0 + ncols = 0 @contextmanager def woohoo(): + nonlocal ncols + ncols += 1 nonlocal depth before = depth depth += 1 @@ -336,6 +381,7 @@ def recursive(): recursive() recursive() + self.assertEqual(ncols, 10) self.assertEqual(depth, 0) From 9addf2cf117c9b14bbc0c7640bb1f1d6300673b7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 20 Oct 2023 20:28:41 +0200 Subject: [PATCH 111/265] [3.11] gh-111126: Use `isinstance` instead of `assert[Not]IsInstance` in `test_typing` (GH-111127) (#111131) gh-111126: Use `isinstance` instead of `assert[Not]IsInstance` in `test_typing` (GH-111127) (cherry picked from commit ea7c26e4b89c71234c4a603567a93f0a44c9cc97) Co-authored-by: Nikita Sobolev --- Lib/test/test_typing.py | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 177155a6365662..fa279b557de2c6 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -1944,13 +1944,13 @@ def test_callable_instance_type_error(self): def f(): pass with self.assertRaises(TypeError): - self.assertIsInstance(f, Callable[[], None]) + isinstance(f, Callable[[], None]) with self.assertRaises(TypeError): - self.assertIsInstance(f, Callable[[], Any]) + isinstance(f, Callable[[], Any]) with self.assertRaises(TypeError): - self.assertNotIsInstance(None, Callable[[], None]) + isinstance(None, Callable[[], None]) with self.assertRaises(TypeError): - self.assertNotIsInstance(None, Callable[[], Any]) + isinstance(None, Callable[[], Any]) def test_repr(self): Callable = self.Callable From 47670fbdd0b6c42a0d026f26c67178371b86069a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 21 Oct 2023 11:01:10 +0200 Subject: [PATCH 112/265] [3.11] gh-110932: Fix regrtest for SOURCE_DATE_EPOCH (GH-111143) (#111153) gh-110932: Fix regrtest for SOURCE_DATE_EPOCH (GH-111143) If the SOURCE_DATE_EPOCH environment variable is defined, use its value as the random seed. 
(cherry picked from commit 7237fb578dc9db9dc557759a24d8083425107b91) Co-authored-by: Victor Stinner --- Lib/test/libregrtest/main.py | 21 +++--- Lib/test/libregrtest/runtests.py | 2 +- Lib/test/test_regrtest.py | 72 +++++++++++++++---- ...-10-21-00-10-36.gh-issue-110932.jktjJU.rst | 2 + 4 files changed, 75 insertions(+), 22 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index fe35df05c80e89..e765ed5c613acb 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -129,14 +129,19 @@ def __init__(self, ns: Namespace, _add_python_opts: bool = False): # Randomize self.randomize: bool = ns.randomize - self.random_seed: int | None = ( - ns.random_seed - if ns.random_seed is not None - else random.getrandbits(32) - ) - if 'SOURCE_DATE_EPOCH' in os.environ: + if ('SOURCE_DATE_EPOCH' in os.environ + # don't use the variable if empty + and os.environ['SOURCE_DATE_EPOCH'] + ): self.randomize = False - self.random_seed = None + # SOURCE_DATE_EPOCH should be an integer, but use a string to not + # fail if it's not integer. random.seed() accepts a string. + # https://reproducible-builds.org/docs/source-date-epoch/ + self.random_seed: int | str = os.environ['SOURCE_DATE_EPOCH'] + elif ns.random_seed is None: + self.random_seed = random.getrandbits(32) + else: + self.random_seed = ns.random_seed # tests self.first_runtests: RunTests | None = None @@ -441,7 +446,7 @@ def _run_tests(self, selected: TestTuple, tests: TestList | None) -> int: or tests or self.cmdline_args)): display_header(self.use_resources, self.python_cmd) - print("Using random seed", self.random_seed) + print("Using random seed:", self.random_seed) runtests = self.create_run_tests(selected) self.first_runtests = runtests diff --git a/Lib/test/libregrtest/runtests.py b/Lib/test/libregrtest/runtests.py index 4da312db4cb02e..893b311c31297c 100644 --- a/Lib/test/libregrtest/runtests.py +++ b/Lib/test/libregrtest/runtests.py @@ -91,7 +91,7 @@ class RunTests: use_resources: tuple[str, ...] python_cmd: tuple[str, ...] 
| None randomize: bool - random_seed: int | None + random_seed: int | str json_file: JsonFile | None def copy(self, **override): diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index c8e182397c835e..91f2fb0286adfe 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -143,18 +143,26 @@ def test_header(self): self.assertTrue(ns.header) def test_randomize(self): - for opt in '-r', '--randomize': + for opt in ('-r', '--randomize'): with self.subTest(opt=opt): ns = self.parse_args([opt]) self.assertTrue(ns.randomize) with os_helper.EnvironmentVarGuard() as env: - env['SOURCE_DATE_EPOCH'] = '1' - + # with SOURCE_DATE_EPOCH + env['SOURCE_DATE_EPOCH'] = '1697839080' ns = self.parse_args(['--randomize']) regrtest = main.Regrtest(ns) self.assertFalse(regrtest.randomize) - self.assertIsNone(regrtest.random_seed) + self.assertIsInstance(regrtest.random_seed, str) + self.assertEqual(regrtest.random_seed, '1697839080') + + # without SOURCE_DATE_EPOCH + del env['SOURCE_DATE_EPOCH'] + ns = self.parse_args(['--randomize']) + regrtest = main.Regrtest(ns) + self.assertTrue(regrtest.randomize) + self.assertIsInstance(regrtest.random_seed, int) def test_randseed(self): ns = self.parse_args(['--randseed', '12345']) @@ -388,7 +396,13 @@ def check_ci_mode(self, args, use_resources, rerun=True): # Check Regrtest attributes which are more reliable than Namespace # which has an unclear API - regrtest = main.Regrtest(ns) + with os_helper.EnvironmentVarGuard() as env: + # Ignore SOURCE_DATE_EPOCH env var if it's set + if 'SOURCE_DATE_EPOCH' in env: + del env['SOURCE_DATE_EPOCH'] + + regrtest = main.Regrtest(ns) + self.assertEqual(regrtest.num_workers, -1) self.assertEqual(regrtest.want_rerun, rerun) self.assertTrue(regrtest.randomize) @@ -662,21 +676,26 @@ def list_regex(line_format, tests): state = f'{state} then {new_state}' self.check_line(output, f'Result: {state}', full=True) - def parse_random_seed(self, output): - match = self.regex_search(r'Using random seed ([0-9]+)', output) - randseed = int(match.group(1)) - self.assertTrue(0 <= randseed, randseed) - return randseed + def parse_random_seed(self, output: str) -> str: + match = self.regex_search(r'Using random seed: (.*)', output) + return match.group(1) def run_command(self, args, input=None, exitcode=0, **kw): if not input: input = '' if 'stderr' not in kw: kw['stderr'] = subprocess.STDOUT + + env = kw.pop('env', None) + if env is None: + env = dict(os.environ) + env.pop('SOURCE_DATE_EPOCH', None) + proc = subprocess.run(args, text=True, input=input, stdout=subprocess.PIPE, + env=env, **kw) if proc.returncode != exitcode: msg = ("Command %s failed with exit code %s, but exit code %s expected!\n" @@ -752,7 +771,9 @@ def setUp(self): self.regrtest_args.append('-n') def check_output(self, output): - self.parse_random_seed(output) + randseed = self.parse_random_seed(output) + self.assertTrue(randseed.isdigit(), randseed) + self.check_executed_tests(output, self.tests, randomize=True, stats=len(self.tests)) @@ -943,7 +964,7 @@ def test_random(self): test_random = int(match.group(1)) # try to reproduce with the random seed - output = self.run_tests('-r', '--randseed=%s' % randseed, test, + output = self.run_tests('-r', f'--randseed={randseed}', test, exitcode=EXITCODE_NO_TESTS_RAN) randseed2 = self.parse_random_seed(output) self.assertEqual(randseed2, randseed) @@ -954,7 +975,32 @@ def test_random(self): # check that random.seed is used by default output = self.run_tests(test, exitcode=EXITCODE_NO_TESTS_RAN) - 
self.assertIsInstance(self.parse_random_seed(output), int) + randseed = self.parse_random_seed(output) + self.assertTrue(randseed.isdigit(), randseed) + + # check SOURCE_DATE_EPOCH (integer) + timestamp = '1697839080' + env = dict(os.environ, SOURCE_DATE_EPOCH=timestamp) + output = self.run_tests('-r', test, exitcode=EXITCODE_NO_TESTS_RAN, + env=env) + randseed = self.parse_random_seed(output) + self.assertEqual(randseed, timestamp) + self.check_line(output, 'TESTRANDOM: 520') + + # check SOURCE_DATE_EPOCH (string) + env = dict(os.environ, SOURCE_DATE_EPOCH='XYZ') + output = self.run_tests('-r', test, exitcode=EXITCODE_NO_TESTS_RAN, + env=env) + randseed = self.parse_random_seed(output) + self.assertEqual(randseed, 'XYZ') + self.check_line(output, 'TESTRANDOM: 22') + + # check SOURCE_DATE_EPOCH (empty string): ignore the env var + env = dict(os.environ, SOURCE_DATE_EPOCH='') + output = self.run_tests('-r', test, exitcode=EXITCODE_NO_TESTS_RAN, + env=env) + randseed = self.parse_random_seed(output) + self.assertTrue(randseed.isdigit(), randseed) def test_fromfile(self): # test --fromfile diff --git a/Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst b/Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst new file mode 100644 index 00000000000000..45bb0774a9abe3 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst @@ -0,0 +1,2 @@ +Fix regrtest if the ``SOURCE_DATE_EPOCH`` environment variable is defined: +use the variable value as the random seed. Patch by Victor Stinner. From 17b8e35b65315efbb859100c9d0393eed338a815 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 21 Oct 2023 19:06:51 +0200 Subject: [PATCH 113/265] [3.11] gh-111157: Mention `__notes__` in `traceback.format_exception_only` docstring (GH-111158) (#111164) gh-111157: Mention `__notes__` in `traceback.format_exception_only` docstring (GH-111158) (cherry picked from commit 5e7727b05232b43589d177c15263d7f4f8c584a0) Co-authored-by: Nikita Sobolev --- Lib/traceback.py | 27 ++++++++++++--------------- 1 file changed, 12 insertions(+), 15 deletions(-) diff --git a/Lib/traceback.py b/Lib/traceback.py index d3edd3a63efed9..0e229553cb5d25 100644 --- a/Lib/traceback.py +++ b/Lib/traceback.py @@ -145,14 +145,11 @@ def format_exception_only(exc, /, value=_sentinel): The return value is a list of strings, each ending in a newline. - Normally, the list contains a single string; however, for - SyntaxError exceptions, it contains several lines that (when - printed) display detailed information about where the syntax - error occurred. - - The message indicating which exception occurred is always the last - string in the list. - + The list contains the exception's message, which is + normally a single string; however, for :exc:`SyntaxError` exceptions, it + contains several lines that (when printed) display detailed information + about where the syntax error occurred. Following the message, the list + contains the exception's ``__notes__``. """ if value is _sentinel: value = exc @@ -817,13 +814,13 @@ def format_exception_only(self): The return value is a generator of strings, each ending in a newline. - Normally, the generator emits a single string; however, for - SyntaxError exceptions, it emits several lines that (when - printed) display detailed information about where the syntax - error occurred. - - The message indicating which exception occurred is always the last - string in the output. 
+ Generator yields the exception message. + For :exc:`SyntaxError` exceptions, it + also yields (before the exception message) + several lines that (when printed) + display detailed information about where the syntax error occurred. + Following the message, generator also yields + all the exception's ``__notes__``. """ if self.exc_type is None: yield _format_final_exc_line(None, self._str) From 4222dd93af710ba9f427d4f5d95a9241298ff0db Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Sat, 21 Oct 2023 20:30:19 +0300 Subject: [PATCH 114/265] [3.11] gh-110918: regrtest: allow to intermix --match and --ignore options (GH-110919) (GH-111168) Test case matching patterns specified by options --match, --ignore, --matchfile and --ignorefile are now tested in the order of specification, and the last match determines whether the test case be run or ignored. (cherry picked from commit 9a1fe09622cd0f1e24c2ba5335c94c5d70306fd0) --- Lib/test/libregrtest/cmdline.py | 40 +++++----- Lib/test/libregrtest/findtests.py | 7 +- Lib/test/libregrtest/main.py | 15 +--- Lib/test/libregrtest/run_workers.py | 2 +- Lib/test/libregrtest/runtests.py | 5 +- Lib/test/libregrtest/setup.py | 2 +- Lib/test/libregrtest/utils.py | 1 + Lib/test/libregrtest/worker.py | 6 +- Lib/test/support/__init__.py | 76 +++++++------------ Lib/test/test_regrtest.py | 42 +++++----- Lib/test/test_support.py | 67 +++++++++------- ...-10-16-13-47-24.gh-issue-110918.aFgZK3.rst | 4 + 12 files changed, 126 insertions(+), 141 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst diff --git a/Lib/test/libregrtest/cmdline.py b/Lib/test/libregrtest/cmdline.py index b1b00893402d8e..905b3f862281f1 100644 --- a/Lib/test/libregrtest/cmdline.py +++ b/Lib/test/libregrtest/cmdline.py @@ -161,8 +161,7 @@ def __init__(self, **kwargs) -> None: self.forever = False self.header = False self.failfast = False - self.match_tests = None - self.ignore_tests = None + self.match_tests = [] self.pgo = False self.pgo_extended = False self.worker_json = None @@ -183,6 +182,20 @@ def error(self, message): super().error(message + "\nPass -h or --help for complete help.") +class FilterAction(argparse.Action): + def __call__(self, parser, namespace, value, option_string=None): + items = getattr(namespace, self.dest) + items.append((value, self.const)) + + +class FromFileFilterAction(argparse.Action): + def __call__(self, parser, namespace, value, option_string=None): + items = getattr(namespace, self.dest) + with open(value, encoding='utf-8') as fp: + for line in fp: + items.append((line.strip(), self.const)) + + def _create_parser(): # Set prog to prevent the uninformative "__main__.py" from displaying in # error messages when using "python -m test ...". @@ -192,6 +205,7 @@ def _create_parser(): epilog=EPILOG, add_help=False, formatter_class=argparse.RawDescriptionHelpFormatter) + parser.set_defaults(match_tests=[]) # Arguments with this clause added to its help are described further in # the epilog's "Additional option details" section. @@ -251,17 +265,19 @@ def _create_parser(): help='single step through a set of tests.' 
+ more_details) group.add_argument('-m', '--match', metavar='PAT', - dest='match_tests', action='append', + dest='match_tests', action=FilterAction, const=True, help='match test cases and methods with glob pattern PAT') group.add_argument('-i', '--ignore', metavar='PAT', - dest='ignore_tests', action='append', + dest='match_tests', action=FilterAction, const=False, help='ignore test cases and methods with glob pattern PAT') group.add_argument('--matchfile', metavar='FILENAME', - dest='match_filename', + dest='match_tests', + action=FromFileFilterAction, const=True, help='similar to --match but get patterns from a ' 'text file, one pattern per line') group.add_argument('--ignorefile', metavar='FILENAME', - dest='ignore_filename', + dest='match_tests', + action=FromFileFilterAction, const=False, help='similar to --matchfile but it receives patterns ' 'from text file to ignore') group.add_argument('-G', '--failfast', action='store_true', @@ -483,18 +499,6 @@ def _parse_args(args, **kwargs): print("WARNING: Disable --verbose3 because it's incompatible with " "--huntrleaks: see http://bugs.python.org/issue27103", file=sys.stderr) - if ns.match_filename: - if ns.match_tests is None: - ns.match_tests = [] - with open(ns.match_filename) as fp: - for line in fp: - ns.match_tests.append(line.strip()) - if ns.ignore_filename: - if ns.ignore_tests is None: - ns.ignore_tests = [] - with open(ns.ignore_filename) as fp: - for line in fp: - ns.ignore_tests.append(line.strip()) if ns.forever: # --forever implies --failfast ns.failfast = True diff --git a/Lib/test/libregrtest/findtests.py b/Lib/test/libregrtest/findtests.py index 96cc3e0d021184..dd39fe1ce20a77 100644 --- a/Lib/test/libregrtest/findtests.py +++ b/Lib/test/libregrtest/findtests.py @@ -5,7 +5,7 @@ from test import support from .utils import ( - StrPath, TestName, TestTuple, TestList, FilterTuple, + StrPath, TestName, TestTuple, TestList, TestFilter, abs_module_name, count, printlist) @@ -82,11 +82,10 @@ def _list_cases(suite): print(test.id()) def list_cases(tests: TestTuple, *, - match_tests: FilterTuple | None = None, - ignore_tests: FilterTuple | None = None, + match_tests: TestFilter | None = None, test_dir: StrPath | None = None): support.verbose = False - support.set_match_tests(match_tests, ignore_tests) + support.set_match_tests(match_tests) skipped = [] for test_name in tests: diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index e765ed5c613acb..8544bb484c8be0 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -19,7 +19,7 @@ from .setup import setup_process, setup_test_dir from .single import run_single_test, PROGRESS_MIN_TIME from .utils import ( - StrPath, StrJSON, TestName, TestList, TestTuple, FilterTuple, + StrPath, StrJSON, TestName, TestList, TestTuple, TestFilter, strip_py_suffix, count, format_duration, printlist, get_temp_dir, get_work_dir, exit_timeout, display_header, cleanup_temp_dir, print_warning, @@ -78,14 +78,7 @@ def __init__(self, ns: Namespace, _add_python_opts: bool = False): and ns._add_python_opts) # Select tests - if ns.match_tests: - self.match_tests: FilterTuple | None = tuple(ns.match_tests) - else: - self.match_tests = None - if ns.ignore_tests: - self.ignore_tests: FilterTuple | None = tuple(ns.ignore_tests) - else: - self.ignore_tests = None + self.match_tests: TestFilter = ns.match_tests self.exclude: bool = ns.exclude self.fromfile: StrPath | None = ns.fromfile self.starting_test: TestName | None = ns.start @@ -389,7 +382,7 @@ def 
finalize_tests(self, tracer): def display_summary(self): duration = time.perf_counter() - self.logger.start_time - filtered = bool(self.match_tests) or bool(self.ignore_tests) + filtered = bool(self.match_tests) # Total duration print() @@ -407,7 +400,6 @@ def create_run_tests(self, tests: TestTuple): fail_fast=self.fail_fast, fail_env_changed=self.fail_env_changed, match_tests=self.match_tests, - ignore_tests=self.ignore_tests, match_tests_dict=None, rerun=False, forever=self.forever, @@ -660,7 +652,6 @@ def main(self, tests: TestList | None = None): elif self.want_list_cases: list_cases(selected, match_tests=self.match_tests, - ignore_tests=self.ignore_tests, test_dir=self.test_dir) else: exitcode = self.run_tests(selected, tests) diff --git a/Lib/test/libregrtest/run_workers.py b/Lib/test/libregrtest/run_workers.py index 16f8331abd32f9..ab03cb54d6122e 100644 --- a/Lib/test/libregrtest/run_workers.py +++ b/Lib/test/libregrtest/run_workers.py @@ -261,7 +261,7 @@ def create_worker_runtests(self, test_name: TestName, json_file: JsonFile) -> Ru kwargs = {} if match_tests: - kwargs['match_tests'] = match_tests + kwargs['match_tests'] = [(test, True) for test in match_tests] if self.runtests.output_on_failure: kwargs['verbose'] = True kwargs['output_on_failure'] = False diff --git a/Lib/test/libregrtest/runtests.py b/Lib/test/libregrtest/runtests.py index 893b311c31297c..bfed1b4a2a5817 100644 --- a/Lib/test/libregrtest/runtests.py +++ b/Lib/test/libregrtest/runtests.py @@ -8,7 +8,7 @@ from test import support from .utils import ( - StrPath, StrJSON, TestTuple, FilterTuple, FilterDict) + StrPath, StrJSON, TestTuple, TestFilter, FilterTuple, FilterDict) class JsonFileType: @@ -72,8 +72,7 @@ class RunTests: tests: TestTuple fail_fast: bool fail_env_changed: bool - match_tests: FilterTuple | None - ignore_tests: FilterTuple | None + match_tests: TestFilter match_tests_dict: FilterDict | None rerun: bool forever: bool diff --git a/Lib/test/libregrtest/setup.py b/Lib/test/libregrtest/setup.py index 793347f60ad93c..6a96b051394d20 100644 --- a/Lib/test/libregrtest/setup.py +++ b/Lib/test/libregrtest/setup.py @@ -92,7 +92,7 @@ def setup_tests(runtests: RunTests): support.PGO = runtests.pgo support.PGO_EXTENDED = runtests.pgo_extended - support.set_match_tests(runtests.match_tests, runtests.ignore_tests) + support.set_match_tests(runtests.match_tests) if runtests.use_junit: support.junit_xml_list = [] diff --git a/Lib/test/libregrtest/utils.py b/Lib/test/libregrtest/utils.py index f62184ad4fec6c..f6de62af1f0ed5 100644 --- a/Lib/test/libregrtest/utils.py +++ b/Lib/test/libregrtest/utils.py @@ -52,6 +52,7 @@ TestList = list[TestName] # --match and --ignore options: list of patterns # ('*' joker character can be used) +TestFilter = list[tuple[TestName, bool]] FilterTuple = tuple[TestName, ...] 
FilterDict = dict[TestName, FilterTuple] diff --git a/Lib/test/libregrtest/worker.py b/Lib/test/libregrtest/worker.py index a9c8be0bb65d08..2eccfabc25223a 100644 --- a/Lib/test/libregrtest/worker.py +++ b/Lib/test/libregrtest/worker.py @@ -10,7 +10,7 @@ from .runtests import RunTests, JsonFile, JsonFileType from .single import run_single_test from .utils import ( - StrPath, StrJSON, FilterTuple, + StrPath, StrJSON, TestFilter, get_temp_dir, get_work_dir, exit_timeout) @@ -73,7 +73,7 @@ def create_worker_process(runtests: RunTests, output_fd: int, def worker_process(worker_json: StrJSON) -> NoReturn: runtests = RunTests.from_json(worker_json) test_name = runtests.tests[0] - match_tests: FilterTuple | None = runtests.match_tests + match_tests: TestFilter = runtests.match_tests json_file: JsonFile = runtests.json_file setup_test_dir(runtests.test_dir) @@ -81,7 +81,7 @@ def worker_process(worker_json: StrJSON) -> NoReturn: if runtests.rerun: if match_tests: - matching = "matching: " + ", ".join(match_tests) + matching = "matching: " + ", ".join(pattern for pattern, result in match_tests if result) print(f"Re-running {test_name} in verbose mode ({matching})", flush=True) else: print(f"Re-running {test_name} in verbose mode", flush=True) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 9066f0cd5b3819..2d53f635cef7d3 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -6,7 +6,9 @@ import contextlib import dataclasses import functools +import itertools import getpass +import operator import os import re import stat @@ -1172,18 +1174,17 @@ def _run_suite(suite): # By default, don't filter tests -_match_test_func = None - -_accept_test_patterns = None -_ignore_test_patterns = None +_test_matchers = () +_test_patterns = () def match_test(test): # Function used by support.run_unittest() and regrtest --list-cases - if _match_test_func is None: - return True - else: - return _match_test_func(test.id()) + result = False + for matcher, result in reversed(_test_matchers): + if matcher(test.id()): + return result + return not result def _is_full_match_test(pattern): @@ -1196,47 +1197,30 @@ def _is_full_match_test(pattern): return ('.' 
in pattern) and (not re.search(r'[?*\[\]]', pattern)) -def set_match_tests(accept_patterns=None, ignore_patterns=None): - global _match_test_func, _accept_test_patterns, _ignore_test_patterns - - if accept_patterns is None: - accept_patterns = () - if ignore_patterns is None: - ignore_patterns = () - - accept_func = ignore_func = None - - if accept_patterns != _accept_test_patterns: - accept_patterns, accept_func = _compile_match_function(accept_patterns) - if ignore_patterns != _ignore_test_patterns: - ignore_patterns, ignore_func = _compile_match_function(ignore_patterns) - - # Create a copy since patterns can be mutable and so modified later - _accept_test_patterns = tuple(accept_patterns) - _ignore_test_patterns = tuple(ignore_patterns) +def set_match_tests(patterns): + global _test_matchers, _test_patterns - if accept_func is not None or ignore_func is not None: - def match_function(test_id): - accept = True - ignore = False - if accept_func: - accept = accept_func(test_id) - if ignore_func: - ignore = ignore_func(test_id) - return accept and not ignore - - _match_test_func = match_function + if not patterns: + _test_matchers = () + _test_patterns = () + else: + itemgetter = operator.itemgetter + patterns = tuple(patterns) + if patterns != _test_patterns: + _test_matchers = [ + (_compile_match_function(map(itemgetter(0), it)), result) + for result, it in itertools.groupby(patterns, itemgetter(1)) + ] + _test_patterns = patterns def _compile_match_function(patterns): - if not patterns: - func = None - # set_match_tests(None) behaves as set_match_tests(()) - patterns = () - elif all(map(_is_full_match_test, patterns)): + patterns = list(patterns) + + if all(map(_is_full_match_test, patterns)): # Simple case: all patterns are full test identifier. # The test.bisect_cmd utility only uses such full test identifiers. - func = set(patterns).__contains__ + return set(patterns).__contains__ else: import fnmatch regex = '|'.join(map(fnmatch.translate, patterns)) @@ -1244,7 +1228,7 @@ def _compile_match_function(patterns): # don't use flags=re.IGNORECASE regex_match = re.compile(regex).match - def match_test_regex(test_id): + def match_test_regex(test_id, regex_match=regex_match): if regex_match(test_id): # The regex matches the whole identifier, for example # 'test.test_os.FileTests.test_access'. @@ -1255,9 +1239,7 @@ def match_test_regex(test_id): # into: 'test', 'test_os', 'FileTests' and 'test_access'. 
return any(map(regex_match, test_id.split("."))) - func = match_test_regex - - return patterns, func + return match_test_regex def run_unittest(*classes): diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 91f2fb0286adfe..0c12b0efff88e5 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -192,34 +192,27 @@ def test_single(self): self.assertTrue(ns.single) self.checkError([opt, '-f', 'foo'], "don't go together") - def test_ignore(self): - for opt in '-i', '--ignore': + def test_match(self): + for opt in '-m', '--match': with self.subTest(opt=opt): ns = self.parse_args([opt, 'pattern']) - self.assertEqual(ns.ignore_tests, ['pattern']) + self.assertEqual(ns.match_tests, [('pattern', True)]) self.checkError([opt], 'expected one argument') - self.addCleanup(os_helper.unlink, os_helper.TESTFN) - with open(os_helper.TESTFN, "w") as fp: - print('matchfile1', file=fp) - print('matchfile2', file=fp) - - filename = os.path.abspath(os_helper.TESTFN) - ns = self.parse_args(['-m', 'match', - '--ignorefile', filename]) - self.assertEqual(ns.ignore_tests, - ['matchfile1', 'matchfile2']) - - def test_match(self): - for opt in '-m', '--match': + for opt in '-i', '--ignore': with self.subTest(opt=opt): ns = self.parse_args([opt, 'pattern']) - self.assertEqual(ns.match_tests, ['pattern']) + self.assertEqual(ns.match_tests, [('pattern', False)]) self.checkError([opt], 'expected one argument') - ns = self.parse_args(['-m', 'pattern1', - '-m', 'pattern2']) - self.assertEqual(ns.match_tests, ['pattern1', 'pattern2']) + ns = self.parse_args(['-m', 'pattern1', '-m', 'pattern2']) + self.assertEqual(ns.match_tests, [('pattern1', True), ('pattern2', True)]) + + ns = self.parse_args(['-m', 'pattern1', '-i', 'pattern2']) + self.assertEqual(ns.match_tests, [('pattern1', True), ('pattern2', False)]) + + ns = self.parse_args(['-i', 'pattern1', '-m', 'pattern2']) + self.assertEqual(ns.match_tests, [('pattern1', False), ('pattern2', True)]) self.addCleanup(os_helper.unlink, os_helper.TESTFN) with open(os_helper.TESTFN, "w") as fp: @@ -227,10 +220,13 @@ def test_match(self): print('matchfile2', file=fp) filename = os.path.abspath(os_helper.TESTFN) - ns = self.parse_args(['-m', 'match', - '--matchfile', filename]) + ns = self.parse_args(['-m', 'match', '--matchfile', filename]) + self.assertEqual(ns.match_tests, + [('match', True), ('matchfile1', True), ('matchfile2', True)]) + + ns = self.parse_args(['-i', 'match', '--ignorefile', filename]) self.assertEqual(ns.match_tests, - ['match', 'matchfile1', 'matchfile2']) + [('match', False), ('matchfile1', False), ('matchfile2', False)]) def test_failfast(self): for opt in '-G', '--failfast': diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index 01ceeec3c0a73c..3f9159642ba178 100644 --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -560,100 +560,109 @@ def id(self): test_access = Test('test.test_os.FileTests.test_access') test_chdir = Test('test.test_os.Win32ErrorTests.test_chdir') + test_copy = Test('test.test_shutil.TestCopy.test_copy') # Test acceptance - with support.swap_attr(support, '_match_test_func', None): + with support.swap_attr(support, '_test_matchers', ()): # match all support.set_match_tests([]) self.assertTrue(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) # match all using None - support.set_match_tests(None, None) + support.set_match_tests(None) self.assertTrue(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) # match the 
full test identifier - support.set_match_tests([test_access.id()], None) + support.set_match_tests([(test_access.id(), True)]) self.assertTrue(support.match_test(test_access)) self.assertFalse(support.match_test(test_chdir)) # match the module name - support.set_match_tests(['test_os'], None) + support.set_match_tests([('test_os', True)]) self.assertTrue(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) + self.assertFalse(support.match_test(test_copy)) # Test '*' pattern - support.set_match_tests(['test_*'], None) + support.set_match_tests([('test_*', True)]) self.assertTrue(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) # Test case sensitivity - support.set_match_tests(['filetests'], None) + support.set_match_tests([('filetests', True)]) self.assertFalse(support.match_test(test_access)) - support.set_match_tests(['FileTests'], None) + support.set_match_tests([('FileTests', True)]) self.assertTrue(support.match_test(test_access)) # Test pattern containing '.' and a '*' metacharacter - support.set_match_tests(['*test_os.*.test_*'], None) + support.set_match_tests([('*test_os.*.test_*', True)]) self.assertTrue(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) + self.assertFalse(support.match_test(test_copy)) # Multiple patterns - support.set_match_tests([test_access.id(), test_chdir.id()], None) + support.set_match_tests([(test_access.id(), True), (test_chdir.id(), True)]) self.assertTrue(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) + self.assertFalse(support.match_test(test_copy)) - support.set_match_tests(['test_access', 'DONTMATCH'], None) + support.set_match_tests([('test_access', True), ('DONTMATCH', True)]) self.assertTrue(support.match_test(test_access)) self.assertFalse(support.match_test(test_chdir)) # Test rejection - with support.swap_attr(support, '_match_test_func', None): - # match all - support.set_match_tests(ignore_patterns=[]) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - - # match all using None - support.set_match_tests(None, None) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - + with support.swap_attr(support, '_test_matchers', ()): # match the full test identifier - support.set_match_tests(None, [test_access.id()]) + support.set_match_tests([(test_access.id(), False)]) self.assertFalse(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) # match the module name - support.set_match_tests(None, ['test_os']) + support.set_match_tests([('test_os', False)]) self.assertFalse(support.match_test(test_access)) self.assertFalse(support.match_test(test_chdir)) + self.assertTrue(support.match_test(test_copy)) # Test '*' pattern - support.set_match_tests(None, ['test_*']) + support.set_match_tests([('test_*', False)]) self.assertFalse(support.match_test(test_access)) self.assertFalse(support.match_test(test_chdir)) # Test case sensitivity - support.set_match_tests(None, ['filetests']) + support.set_match_tests([('filetests', False)]) self.assertTrue(support.match_test(test_access)) - support.set_match_tests(None, ['FileTests']) + support.set_match_tests([('FileTests', False)]) self.assertFalse(support.match_test(test_access)) # Test pattern containing '.' 
and a '*' metacharacter - support.set_match_tests(None, ['*test_os.*.test_*']) + support.set_match_tests([('*test_os.*.test_*', False)]) self.assertFalse(support.match_test(test_access)) self.assertFalse(support.match_test(test_chdir)) + self.assertTrue(support.match_test(test_copy)) # Multiple patterns - support.set_match_tests(None, [test_access.id(), test_chdir.id()]) + support.set_match_tests([(test_access.id(), False), (test_chdir.id(), False)]) + self.assertFalse(support.match_test(test_access)) + self.assertFalse(support.match_test(test_chdir)) + self.assertTrue(support.match_test(test_copy)) + + support.set_match_tests([('test_access', False), ('DONTMATCH', False)]) self.assertFalse(support.match_test(test_access)) + self.assertTrue(support.match_test(test_chdir)) + + # Test mixed filters + with support.swap_attr(support, '_test_matchers', ()): + support.set_match_tests([('*test_os', False), ('test_access', True)]) + self.assertTrue(support.match_test(test_access)) self.assertFalse(support.match_test(test_chdir)) + self.assertTrue(support.match_test(test_copy)) - support.set_match_tests(None, ['test_access', 'DONTMATCH']) + support.set_match_tests([('*test_os', True), ('test_access', False)]) self.assertFalse(support.match_test(test_access)) self.assertTrue(support.match_test(test_chdir)) + self.assertFalse(support.match_test(test_copy)) @unittest.skipIf(support.is_emscripten, "Unstable in Emscripten") @unittest.skipIf(support.is_wasi, "Unavailable on WASI") diff --git a/Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst b/Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst new file mode 100644 index 00000000000000..7cb79c0cbf29f1 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst @@ -0,0 +1,4 @@ +Test case matching patterns specified by options ``--match``, ``--ignore``, +``--matchfile`` and ``--ignorefile`` are now tested in the order of +specification, and the last match determines whether the test case be run or +ignored. From cf28c61c73174638b7596d9fff70f3c4e4965b30 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 21 Oct 2023 20:23:38 +0200 Subject: [PATCH 115/265] [3.11] gh-111159: Fix `doctest` output comparison for exceptions with notes (GH-111160) (#111170) gh-111159: Fix `doctest` output comparison for exceptions with notes (GH-111160) (cherry picked from commit fd60549c0ac6c81f05594a5141d24b4433ae39be) Co-authored-by: Nikita Sobolev --- Lib/doctest.py | 15 +- Lib/test/test_doctest.py | 144 ++++++++++++++++++ ...-10-21-13-57-06.gh-issue-111159.GoHp7s.rst | 1 + 3 files changed, 159 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst diff --git a/Lib/doctest.py b/Lib/doctest.py index 8fcb4ee0a02ee0..783664a0d3554e 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1370,7 +1370,20 @@ def __run(self, test, compileflags, out): # The example raised an exception: check if it was expected. else: - exc_msg = traceback.format_exception_only(*exception[:2])[-1] + formatted_ex = traceback.format_exception_only(*exception[:2]) + if issubclass(exception[0], SyntaxError): + # SyntaxError / IndentationError is special: + # we don't care about the carets / suggestions / etc + # We only care about the error message and notes. 
+ # They start with `SyntaxError:` (or any other class name) + exc_msg_index = next( + index + for index, line in enumerate(formatted_ex) + if line.startswith(f"{exception[0].__name__}:") + ) + formatted_ex = formatted_ex[exc_msg_index:] + + exc_msg = "".join(formatted_ex) if not quiet: got += _exception_traceback(exception) diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index e48d91fafeda7d..fbeed96e5bee36 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -3168,6 +3168,150 @@ def test_run_doctestsuite_multiple_times(): """ +def test_exception_with_note(note): + """ + >>> test_exception_with_note('Note') + Traceback (most recent call last): + ... + ValueError: Text + Note + + >>> test_exception_with_note('Note') # doctest: +IGNORE_EXCEPTION_DETAIL + Traceback (most recent call last): + ... + ValueError: Text + Note + + >>> test_exception_with_note('''Note + ... multiline + ... example''') + Traceback (most recent call last): + ValueError: Text + Note + multiline + example + + Different note will fail the test: + + >>> def f(x): + ... r''' + ... >>> exc = ValueError('message') + ... >>> exc.add_note('note') + ... >>> raise exc + ... Traceback (most recent call last): + ... ValueError: message + ... wrong note + ... ''' + >>> test = doctest.DocTestFinder().find(f)[0] + >>> doctest.DocTestRunner(verbose=False).run(test) + ... # doctest: +ELLIPSIS + ********************************************************************** + File "...", line 5, in f + Failed example: + raise exc + Expected: + Traceback (most recent call last): + ValueError: message + wrong note + Got: + Traceback (most recent call last): + ... + ValueError: message + note + TestResults(failed=1, attempted=...) + """ + exc = ValueError('Text') + exc.add_note(note) + raise exc + + +def test_exception_with_multiple_notes(): + """ + >>> test_exception_with_multiple_notes() + Traceback (most recent call last): + ... + ValueError: Text + One + Two + """ + exc = ValueError('Text') + exc.add_note('One') + exc.add_note('Two') + raise exc + + +def test_syntax_error_with_note(cls, multiline=False): + """ + >>> test_syntax_error_with_note(SyntaxError) + Traceback (most recent call last): + ... + SyntaxError: error + Note + + >>> test_syntax_error_with_note(SyntaxError) + Traceback (most recent call last): + SyntaxError: error + Note + + >>> test_syntax_error_with_note(SyntaxError) + Traceback (most recent call last): + ... + File "x.py", line 23 + bad syntax + SyntaxError: error + Note + + >>> test_syntax_error_with_note(IndentationError) + Traceback (most recent call last): + ... + IndentationError: error + Note + + >>> test_syntax_error_with_note(TabError, multiline=True) + Traceback (most recent call last): + ... + TabError: error + Note + Line + """ + exc = cls("error", ("x.py", 23, None, "bad syntax")) + exc.add_note('Note\nLine' if multiline else 'Note') + raise exc + + +def test_syntax_error_with_incorrect_expected_note(): + """ + >>> def f(x): + ... r''' + ... >>> exc = SyntaxError("error", ("x.py", 23, None, "bad syntax")) + ... >>> exc.add_note('note1') + ... >>> exc.add_note('note2') + ... >>> raise exc + ... Traceback (most recent call last): + ... SyntaxError: error + ... wrong note + ... ''' + >>> test = doctest.DocTestFinder().find(f)[0] + >>> doctest.DocTestRunner(verbose=False).run(test) + ... 
# doctest: +ELLIPSIS + ********************************************************************** + File "...", line 6, in f + Failed example: + raise exc + Expected: + Traceback (most recent call last): + SyntaxError: error + wrong note + Got: + Traceback (most recent call last): + ... + SyntaxError: error + note1 + note2 + TestResults(failed=1, attempted=...) + """ + + def load_tests(loader, tests, pattern): tests.addTest(doctest.DocTestSuite(doctest)) tests.addTest(doctest.DocTestSuite()) diff --git a/Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst b/Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst new file mode 100644 index 00000000000000..bdec4f4443d80b --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst @@ -0,0 +1 @@ +Fix :mod:`doctest` output comparison for exceptions with notes. From cf777399a93f837bca623e8cf887fc0ec42340e6 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 21 Oct 2023 21:40:07 +0200 Subject: [PATCH 116/265] [3.11] gh-111085: Fix invalid state handling in TaskGroup and Timeout (GH-111111) (GH-111172) asyncio.TaskGroup and asyncio.Timeout classes now raise proper RuntimeError if they are improperly used. * When they are used without entering the context manager. * When they are used after finishing. * When the context manager is entered more than once (simultaneously or sequentially). * If there is no current task when entering the context manager. They now remain in a consistent state after an exception is thrown, so subsequent operations can be performed correctly (if they are allowed). (cherry picked from commit 6c23635f2b7067ef091a550954e09f8b7c329e3f) Co-authored-by: Serhiy Storchaka Co-authored-by: James Hilton-Balfe --- Lib/asyncio/taskgroups.py | 6 +-- Lib/asyncio/timeouts.py | 12 +++-- Lib/test/test_asyncio/test_taskgroups.py | 45 +++++++++++++++++ Lib/test/test_asyncio/test_timeouts.py | 48 ++++++++++++++++++- Lib/test/test_asyncio/utils.py | 15 ++++++ ...-10-20-15-29-10.gh-issue-110910.u2oPwX.rst | 3 ++ 6 files changed, 120 insertions(+), 9 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst diff --git a/Lib/asyncio/taskgroups.py b/Lib/asyncio/taskgroups.py index 0fdea3697ece3d..bfdbe63049f9bd 100644 --- a/Lib/asyncio/taskgroups.py +++ b/Lib/asyncio/taskgroups.py @@ -54,16 +54,14 @@ def __repr__(self): async def __aenter__(self): if self._entered: raise RuntimeError( - f"TaskGroup {self!r} has been already entered") - self._entered = True - + f"TaskGroup {self!r} has already been entered") if self._loop is None: self._loop = events.get_running_loop() - self._parent_task = tasks.current_task(self._loop) if self._parent_task is None: raise RuntimeError( f'TaskGroup {self!r} cannot determine the parent task') + self._entered = True return self diff --git a/Lib/asyncio/timeouts.py b/Lib/asyncio/timeouts.py index 029c468739bf2d..30042abb3ad804 100644 --- a/Lib/asyncio/timeouts.py +++ b/Lib/asyncio/timeouts.py @@ -49,8 +49,9 @@ def when(self) -> Optional[float]: def reschedule(self, when: Optional[float]) -> None: """Reschedule the timeout.""" - assert self._state is not _State.CREATED if self._state is not _State.ENTERED: + if self._state is _State.CREATED: + raise RuntimeError("Timeout has not been entered") raise RuntimeError( f"Cannot change state of {self._state.value} Timeout", ) @@ -82,11 +83,14 @@ def __repr__(self) -> str: return f"" async def 
__aenter__(self) -> "Timeout": + if self._state is not _State.CREATED: + raise RuntimeError("Timeout has already been entered") + task = tasks.current_task() + if task is None: + raise RuntimeError("Timeout should be used inside a task") self._state = _State.ENTERED - self._task = tasks.current_task() + self._task = task self._cancelling = self._task.cancelling() - if self._task is None: - raise RuntimeError("Timeout should be used inside a task") self.reschedule(self._when) return self diff --git a/Lib/test/test_asyncio/test_taskgroups.py b/Lib/test/test_asyncio/test_taskgroups.py index 6a0231f2859a62..7a18362b54e469 100644 --- a/Lib/test/test_asyncio/test_taskgroups.py +++ b/Lib/test/test_asyncio/test_taskgroups.py @@ -8,6 +8,8 @@ from asyncio import taskgroups import unittest +from test.test_asyncio.utils import await_without_task + # To prevent a warning "test altered the execution environment" def tearDownModule(): @@ -779,6 +781,49 @@ async def main(): await asyncio.create_task(main()) + async def test_taskgroup_already_entered(self): + tg = taskgroups.TaskGroup() + async with tg: + with self.assertRaisesRegex(RuntimeError, "has already been entered"): + async with tg: + pass + + async def test_taskgroup_double_enter(self): + tg = taskgroups.TaskGroup() + async with tg: + pass + with self.assertRaisesRegex(RuntimeError, "has already been entered"): + async with tg: + pass + + async def test_taskgroup_finished(self): + tg = taskgroups.TaskGroup() + async with tg: + pass + coro = asyncio.sleep(0) + with self.assertRaisesRegex(RuntimeError, "is finished"): + tg.create_task(coro) + # We still have to await coro to avoid a warning + await coro + + async def test_taskgroup_not_entered(self): + tg = taskgroups.TaskGroup() + coro = asyncio.sleep(0) + with self.assertRaisesRegex(RuntimeError, "has not been entered"): + tg.create_task(coro) + # We still have to await coro to avoid a warning + await coro + + async def test_taskgroup_without_parent_task(self): + tg = taskgroups.TaskGroup() + with self.assertRaisesRegex(RuntimeError, "parent task"): + await await_without_task(tg.__aenter__()) + coro = asyncio.sleep(0) + with self.assertRaisesRegex(RuntimeError, "has not been entered"): + tg.create_task(coro) + # We still have to await coro to avoid a warning + await coro + if __name__ == "__main__": unittest.main() diff --git a/Lib/test/test_asyncio/test_timeouts.py b/Lib/test/test_asyncio/test_timeouts.py index 5a4093e94707bc..bfa3f1bff694ad 100644 --- a/Lib/test/test_asyncio/test_timeouts.py +++ b/Lib/test/test_asyncio/test_timeouts.py @@ -6,11 +6,12 @@ import asyncio from asyncio import tasks +from test.test_asyncio.utils import await_without_task + def tearDownModule(): asyncio.set_event_loop_policy(None) - class TimeoutTests(unittest.IsolatedAsyncioTestCase): async def test_timeout_basic(self): @@ -258,6 +259,51 @@ async def test_timeout_exception_cause (self): cause = exc.exception.__cause__ assert isinstance(cause, asyncio.CancelledError) + async def test_timeout_already_entered(self): + async with asyncio.timeout(0.01) as cm: + with self.assertRaisesRegex(RuntimeError, "has already been entered"): + async with cm: + pass + + async def test_timeout_double_enter(self): + async with asyncio.timeout(0.01) as cm: + pass + with self.assertRaisesRegex(RuntimeError, "has already been entered"): + async with cm: + pass + + async def test_timeout_finished(self): + async with asyncio.timeout(0.01) as cm: + pass + with self.assertRaisesRegex(RuntimeError, "finished"): + cm.reschedule(0.02) + + async 
def test_timeout_expired(self): + with self.assertRaises(TimeoutError): + async with asyncio.timeout(0.01) as cm: + await asyncio.sleep(1) + with self.assertRaisesRegex(RuntimeError, "expired"): + cm.reschedule(0.02) + + async def test_timeout_expiring(self): + async with asyncio.timeout(0.01) as cm: + with self.assertRaises(asyncio.CancelledError): + await asyncio.sleep(1) + with self.assertRaisesRegex(RuntimeError, "expiring"): + cm.reschedule(0.02) + + async def test_timeout_not_entered(self): + cm = asyncio.timeout(0.01) + with self.assertRaisesRegex(RuntimeError, "has not been entered"): + cm.reschedule(0.02) + + async def test_timeout_without_task(self): + cm = asyncio.timeout(0.01) + with self.assertRaisesRegex(RuntimeError, "task"): + await await_without_task(cm.__aenter__()) + with self.assertRaisesRegex(RuntimeError, "has not been entered"): + cm.reschedule(0.02) + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_asyncio/utils.py b/Lib/test/test_asyncio/utils.py index d6f60db10f9b3f..7940855b19efed 100644 --- a/Lib/test/test_asyncio/utils.py +++ b/Lib/test/test_asyncio/utils.py @@ -612,3 +612,18 @@ def mock_nonblocking_socket(proto=socket.IPPROTO_TCP, type=socket.SOCK_STREAM, sock.family = family sock.gettimeout.return_value = 0.0 return sock + + +async def await_without_task(coro): + exc = None + def func(): + try: + for _ in coro.__await__(): + pass + except BaseException as err: + nonlocal exc + exc = err + asyncio.get_running_loop().call_soon(func) + await asyncio.sleep(0) + if exc is not None: + raise exc diff --git a/Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst b/Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst new file mode 100644 index 00000000000000..c750447e9fe4a5 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst @@ -0,0 +1,3 @@ +Fix invalid state handling in :class:`asyncio.TaskGroup` and +:class:`asyncio.Timeout`. They now raise proper RuntimeError if they are +improperly used and are left in consistent state after this. From d0502a9c6786474a56d35744d3d803908df0b794 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 22 Oct 2023 14:03:39 +0200 Subject: [PATCH 117/265] [3.11] gh-101100: Fix Sphinx warning in `tutorial/introduction.rst` (GH-111173) (#111176) gh-101100: Fix Sphinx warning in `tutorial/introduction.rst` (GH-111173) (cherry picked from commit 663cf513b0e973ab7aa4a8609d6616ad2c283f22) Co-authored-by: Maciej Olko --- Doc/tools/.nitignore | 1 - Doc/tutorial/introduction.rst | 2 +- 2 files changed, 1 insertion(+), 2 deletions(-) diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index d42292aed04ea4..08b9f6d688c587 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -144,7 +144,6 @@ Doc/reference/expressions.rst Doc/reference/import.rst Doc/reference/simple_stmts.rst Doc/tutorial/datastructures.rst -Doc/tutorial/introduction.rst Doc/using/windows.rst Doc/whatsnew/2.0.rst Doc/whatsnew/2.1.rst diff --git a/Doc/tutorial/introduction.rst b/Doc/tutorial/introduction.rst index b3b8e423a09fa1..731655a5885d14 100644 --- a/Doc/tutorial/introduction.rst +++ b/Doc/tutorial/introduction.rst @@ -428,7 +428,7 @@ type, i.e. 
it is possible to change their content:: [1, 8, 27, 64, 125] You can also add new items at the end of the list, by using -the :meth:`~list.append` *method* (we will see more about methods later):: +the :meth:`!list.append` *method* (we will see more about methods later):: >>> cubes.append(216) # add the cube of 6 >>> cubes.append(7 ** 3) # and the cube of 7 From aaa755dd48ac35a392bad211b21bef8808f81719 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 22 Oct 2023 19:29:06 +0200 Subject: [PATCH 118/265] [3.11] gh-101100: Fix sphinx warnings in `library/asyncio-dev.rst` (GH-111179) (#111186) gh-101100: Fix sphinx warnings in `library/asyncio-dev.rst` (GH-111179) * gh-101100: Fix sphinx warnings in `library/asyncio-dev.rst` * Update Doc/library/asyncio-eventloop.rst * Update Doc/library/asyncio-eventloop.rst --------- (cherry picked from commit 8c689c9b88426384a9736c708701923a1ab1da79) Co-authored-by: Nikita Sobolev Co-authored-by: Carol Willing --- Doc/library/asyncio-eventloop.rst | 14 +++++++++++--- Doc/tools/.nitignore | 1 - 2 files changed, 11 insertions(+), 4 deletions(-) diff --git a/Doc/library/asyncio-eventloop.rst b/Doc/library/asyncio-eventloop.rst index 07be0d0eea0888..aeebc3bae85e13 100644 --- a/Doc/library/asyncio-eventloop.rst +++ b/Doc/library/asyncio-eventloop.rst @@ -238,9 +238,9 @@ Scheduling callbacks See the :ref:`concurrency and multithreading ` section of the documentation. -.. versionchanged:: 3.7 - The *context* keyword-only parameter was added. See :pep:`567` - for more details. + .. versionchanged:: 3.7 + The *context* keyword-only parameter was added. See :pep:`567` + for more details. .. _asyncio-pass-keywords: @@ -1365,6 +1365,14 @@ Enabling debug mode The new :ref:`Python Development Mode ` can now also be used to enable the debug mode. +.. attribute:: loop.slow_callback_duration + + This attribute can be used to set the + minimum execution duration in seconds that is considered "slow". + When debug mode is enabled, "slow" callbacks are logged. + + Default value is 100 milliseconds. + .. seealso:: The :ref:`debug mode of asyncio `. 
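
The ``loop.slow_callback_duration`` attribute documented in the hunk above can be exercised with a small script along these lines (a minimal sketch, not part of the patch; the 50 ms threshold and the deliberately blocking callback are arbitrary choices for illustration)::

    import asyncio
    import time

    async def main():
        loop = asyncio.get_running_loop()
        loop.set_debug(True)                # enable asyncio debug mode
        loop.slow_callback_duration = 0.05  # treat callbacks slower than 50 ms as "slow"
        loop.call_soon(time.sleep, 0.2)     # deliberately blocking callback
        await asyncio.sleep(0.5)            # give the loop time to run and report it

    asyncio.run(main())

With debug mode enabled, the blocking callback exceeds the configured threshold and is reported as a slow callback by the asyncio logger.
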
diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 08b9f6d688c587..8fd7bcab8b2dff 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -31,7 +31,6 @@ Doc/howto/urllib2.rst Doc/library/__future__.rst Doc/library/abc.rst Doc/library/ast.rst -Doc/library/asyncio-dev.rst Doc/library/asyncio-eventloop.rst Doc/library/asyncio-extending.rst Doc/library/asyncio-policy.rst From 3c3c489d41f3bca783d48bd6bbc70333093d9a99 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 22 Oct 2023 21:12:58 +0200 Subject: [PATCH 119/265] [3.11] gh-110196: Fix ipaddress.IPv6Address.__reduce__ (GH-110198) (GH-111190) (cherry picked from commit 767f416feb551f495bacfff1e9ba1e6672c2f24e) Co-authored-by: Tian Gao --- Lib/ipaddress.py | 3 +++ Lib/test/test_ipaddress.py | 7 +++++++ .../Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst | 1 + 3 files changed, 11 insertions(+) create mode 100644 Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst diff --git a/Lib/ipaddress.py b/Lib/ipaddress.py index 1cb71d8032e173..16ba16cd7de49a 100644 --- a/Lib/ipaddress.py +++ b/Lib/ipaddress.py @@ -1941,6 +1941,9 @@ def __eq__(self, other): return False return self._scope_id == getattr(other, '_scope_id', None) + def __reduce__(self): + return (self.__class__, (str(self),)) + @property def scope_id(self): """Identifier of a particular zone of the address's scope. diff --git a/Lib/test/test_ipaddress.py b/Lib/test/test_ipaddress.py index a5388b2e5debd8..fc27628af17f8d 100644 --- a/Lib/test/test_ipaddress.py +++ b/Lib/test/test_ipaddress.py @@ -4,6 +4,7 @@ """Unittest for ipaddress module.""" +import copy import unittest import re import contextlib @@ -542,11 +543,17 @@ def assertBadPart(addr, part): def test_pickle(self): self.pickle_test('2001:db8::') + self.pickle_test('2001:db8::%scope') def test_weakref(self): weakref.ref(self.factory('2001:db8::')) weakref.ref(self.factory('2001:db8::%scope')) + def test_copy(self): + addr = self.factory('2001:db8::%scope') + self.assertEqual(addr, copy.copy(addr)) + self.assertEqual(addr, copy.deepcopy(addr)) + class NetmaskTestMixin_v4(CommonTestMixin_v4): """Input validation on interfaces and networks is very similar""" diff --git a/Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst b/Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst new file mode 100644 index 00000000000000..341f3380fffd60 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst @@ -0,0 +1 @@ +Add ``__reduce__`` method to :class:`IPv6Address` in order to keep ``scope_id`` From 6020a3e736e83f82048fd46dad8719f738f24cdd Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 23 Oct 2023 09:31:32 +0200 Subject: [PATCH 120/265] [3.11] gh-110383: Added explanation about simplest regex use case for quantifiers. (GH-111110) (#111205) Co-authored-by: Nick Co-authored-by: Hugo van Kemenade --- Doc/howto/regex.rst | 3 +++ 1 file changed, 3 insertions(+) diff --git a/Doc/howto/regex.rst b/Doc/howto/regex.rst index 8d95d86ba398b6..2cc17b4f745cb6 100644 --- a/Doc/howto/regex.rst +++ b/Doc/howto/regex.rst @@ -245,6 +245,9 @@ You can omit either *m* or *n*; in that case, a reasonable value is assumed for the missing value. Omitting *m* is interpreted as a lower limit of 0, while omitting *n* results in an upper bound of infinity. 
+The simplest case ``{m}`` matches the preceding item exactly **m** times. +For example, ``a/{2}b`` will only match ``'a//b'``. + Readers of a reductionist bent may notice that the three other quantifiers can all be expressed using this notation. ``{0,}`` is the same as ``*``, ``{1,}`` is equivalent to ``+``, and ``{0,1}`` is the same as ``?``. It's better to use From 0a23960266fe486c53bca9bc569709dacd3e03c2 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 23 Oct 2023 10:11:54 +0200 Subject: [PATCH 121/265] [3.11] gh-110383: Italicize variable name (GH-111206) (#111208) Co-authored-by: Nick --- Doc/howto/regex.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/howto/regex.rst b/Doc/howto/regex.rst index 2cc17b4f745cb6..b69d19e212bdc9 100644 --- a/Doc/howto/regex.rst +++ b/Doc/howto/regex.rst @@ -245,7 +245,7 @@ You can omit either *m* or *n*; in that case, a reasonable value is assumed for the missing value. Omitting *m* is interpreted as a lower limit of 0, while omitting *n* results in an upper bound of infinity. -The simplest case ``{m}`` matches the preceding item exactly **m** times. +The simplest case ``{m}`` matches the preceding item exactly *m* times. For example, ``a/{2}b`` will only match ``'a//b'``. Readers of a reductionist bent may notice that the three other quantifiers can From f446df741f58488da198a265bc807712b3a66a98 Mon Sep 17 00:00:00 2001 From: Furkan Onder Date: Mon, 23 Oct 2023 12:49:44 +0300 Subject: [PATCH 122/265] [3.11] gh-67565: Add tests for C-contiguity checks (GH-110951) (GH-111199) (cherry picked from commit 9376728ce45191fcc0b908c7487ad7985454537e) --- Lib/test/test_binascii.py | 6 ++++++ Lib/test/test_capi/test_getargs.py | 19 +++++++++++++++++++ Lib/test/test_ssl.py | 4 ++++ 3 files changed, 29 insertions(+) diff --git a/Lib/test/test_binascii.py b/Lib/test/test_binascii.py index 7087d7a471d3f4..6fb57c3166d2b6 100644 --- a/Lib/test/test_binascii.py +++ b/Lib/test/test_binascii.py @@ -428,6 +428,12 @@ def test_b2a_base64_newline(self): self.assertEqual(binascii.b2a_base64(b, newline=False), b'aGVsbG8=') + def test_c_contiguity(self): + m = memoryview(bytearray(b'noncontig')) + noncontig_writable = m[::-2] + with self.assertRaises(BufferError): + binascii.b2a_hex(noncontig_writable) + class ArrayBinASCIITest(BinASCIITest): def type2test(self, s): diff --git a/Lib/test/test_capi/test_getargs.py b/Lib/test/test_capi/test_getargs.py index 552fa213524723..27675533d1bb4b 100644 --- a/Lib/test/test_capi/test_getargs.py +++ b/Lib/test/test_capi/test_getargs.py @@ -152,6 +152,8 @@ class TupleSubclass(tuple): class DictSubclass(dict): pass +NONCONTIG_WRITABLE = memoryview(bytearray(b'noncontig'))[::-2] +NONCONTIG_READONLY = memoryview(b'noncontig')[::-2] class Unsigned_TestCase(unittest.TestCase): def test_b(self): @@ -836,6 +838,8 @@ def test_y_star(self): self.assertEqual(getargs_y_star(bytearray(b'bytearray')), b'bytearray') self.assertEqual(getargs_y_star(memoryview(b'memoryview')), b'memoryview') self.assertRaises(TypeError, getargs_y_star, None) + self.assertRaises(BufferError, getargs_y_star, NONCONTIG_WRITABLE) + self.assertRaises(BufferError, getargs_y_star, NONCONTIG_READONLY) def test_y_hash(self): from _testcapi import getargs_y_hash @@ -845,6 +849,9 @@ def test_y_hash(self): self.assertRaises(TypeError, getargs_y_hash, bytearray(b'bytearray')) self.assertRaises(TypeError, getargs_y_hash, memoryview(b'memoryview')) self.assertRaises(TypeError, getargs_y_hash, None) + 
# TypeError: must be read-only bytes-like object, not memoryview + self.assertRaises(TypeError, getargs_y_hash, NONCONTIG_WRITABLE) + self.assertRaises(TypeError, getargs_y_hash, NONCONTIG_READONLY) def test_w_star(self): # getargs_w_star() modifies first and last byte @@ -860,6 +867,8 @@ def test_w_star(self): self.assertEqual(getargs_w_star(memoryview(buf)), b'[emoryvie]') self.assertEqual(buf, bytearray(b'[emoryvie]')) self.assertRaises(TypeError, getargs_w_star, None) + self.assertRaises(TypeError, getargs_w_star, NONCONTIG_WRITABLE) + self.assertRaises(TypeError, getargs_w_star, NONCONTIG_READONLY) class String_TestCase(unittest.TestCase): @@ -892,6 +901,8 @@ def test_s_star(self): self.assertEqual(getargs_s_star(bytearray(b'bytearray')), b'bytearray') self.assertEqual(getargs_s_star(memoryview(b'memoryview')), b'memoryview') self.assertRaises(TypeError, getargs_s_star, None) + self.assertRaises(BufferError, getargs_s_star, NONCONTIG_WRITABLE) + self.assertRaises(BufferError, getargs_s_star, NONCONTIG_READONLY) def test_s_hash(self): from _testcapi import getargs_s_hash @@ -901,6 +912,9 @@ def test_s_hash(self): self.assertRaises(TypeError, getargs_s_hash, bytearray(b'bytearray')) self.assertRaises(TypeError, getargs_s_hash, memoryview(b'memoryview')) self.assertRaises(TypeError, getargs_s_hash, None) + # TypeError: must be read-only bytes-like object, not memoryview + self.assertRaises(TypeError, getargs_s_hash, NONCONTIG_WRITABLE) + self.assertRaises(TypeError, getargs_s_hash, NONCONTIG_READONLY) def test_s_hash_int(self): # "s#" without PY_SSIZE_T_CLEAN defined. @@ -936,6 +950,8 @@ def test_z_star(self): self.assertEqual(getargs_z_star(bytearray(b'bytearray')), b'bytearray') self.assertEqual(getargs_z_star(memoryview(b'memoryview')), b'memoryview') self.assertIsNone(getargs_z_star(None)) + self.assertRaises(BufferError, getargs_z_star, NONCONTIG_WRITABLE) + self.assertRaises(BufferError, getargs_z_star, NONCONTIG_READONLY) def test_z_hash(self): from _testcapi import getargs_z_hash @@ -945,6 +961,9 @@ def test_z_hash(self): self.assertRaises(TypeError, getargs_z_hash, bytearray(b'bytearray')) self.assertRaises(TypeError, getargs_z_hash, memoryview(b'memoryview')) self.assertIsNone(getargs_z_hash(None)) + # TypeError: must be read-only bytes-like object, not memoryview + self.assertRaises(TypeError, getargs_z_hash, NONCONTIG_WRITABLE) + self.assertRaises(TypeError, getargs_z_hash, NONCONTIG_READONLY) def test_es(self): from _testcapi import getargs_es diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index c98f7679a9f06e..b353f3265cbb9e 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -1952,6 +1952,10 @@ def test_buffer_types(self): self.assertEqual(bio.read(), b'bar') bio.write(memoryview(b'baz')) self.assertEqual(bio.read(), b'baz') + m = memoryview(bytearray(b'noncontig')) + noncontig_writable = m[::-2] + with self.assertRaises(BufferError): + bio.write(memoryview(noncontig_writable)) def test_error_types(self): bio = ssl.MemoryBIO() From 135d5c58405fd4b4d6310206f2b63e3da7b1e69c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 23 Oct 2023 17:49:17 +0200 Subject: [PATCH 123/265] [3.11] gh-106310 - document the __signature__ attribute (GH-106311) (#111146) Co-authored-by: Gouvernathor <44340603+Gouvernathor@users.noreply.github.com> Co-authored-by: Alex Waygood --- Doc/library/inspect.rst | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst 
index 22637a051d9fa6..e95196376bb24b 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -619,6 +619,9 @@ function. Accepts a wide range of Python callables, from plain functions and classes to :func:`functools.partial` objects. + If the passed object has a ``__signature__`` attribute, this function + returns it without further computations. + For objects defined in modules using stringized annotations (``from __future__ import annotations``), :func:`signature` will attempt to automatically un-stringize the annotations using @@ -738,6 +741,8 @@ function. sig = MySignature.from_callable(min) assert isinstance(sig, MySignature) + Its behavior is otherwise identical to that of :func:`signature`. + .. versionadded:: 3.5 .. versionadded:: 3.10 From a449a70bcbd9fd0eeccfad1ff852ede03978d691 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 23 Oct 2023 18:04:52 +0200 Subject: [PATCH 124/265] [3.11] typo: missing line of output in pull parser example (GH-111068) (#111218) Co-authored-by: Don Patterson <37046246+don-patterson@users.noreply.github.com> --- Doc/library/xml.etree.elementtree.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/Doc/library/xml.etree.elementtree.rst b/Doc/library/xml.etree.elementtree.rst index 76c5b89ac66316..7fd4c7a4a5da2b 100644 --- a/Doc/library/xml.etree.elementtree.rst +++ b/Doc/library/xml.etree.elementtree.rst @@ -154,6 +154,7 @@ elements, call :meth:`XMLPullParser.read_events`. Here is an example:: ... print(elem.tag, 'text=', elem.text) ... end + mytag text= sometext more text The obvious use case is applications that operate in a non-blocking fashion where the XML data is being received from a socket or read incrementally from From 09bd752d9461b9104870ccc843b6e07b2f5a1467 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 23 Oct 2023 18:09:56 +0200 Subject: [PATCH 125/265] [3.11] Add a version added note for PY_VECTORCALL_ARGUMENTS_OFFSET (GH-110963) (#111220) Co-authored-by: Anthony Shaw --- Doc/c-api/call.rst | 2 ++ 1 file changed, 2 insertions(+) diff --git a/Doc/c-api/call.rst b/Doc/c-api/call.rst index 838af604dca020..df17c2207e94fd 100644 --- a/Doc/c-api/call.rst +++ b/Doc/c-api/call.rst @@ -99,6 +99,8 @@ This is a pointer to a function with the following signature: Doing so will allow callables such as bound methods to make their onward calls (which include a prepended *self* argument) very efficiently. + .. versionadded:: 3.8 + To call an object that implements vectorcall, use a :ref:`call API ` function as with any other callable. :c:func:`PyObject_Vectorcall` will usually be most efficient. 
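
The ``__signature__`` behaviour documented in the ``inspect`` change above can be demonstrated with a short snippet (a minimal sketch; the function and parameter names are invented for the example)::

    import inspect

    def greet(name, punctuation="!"):
        return f"Hello, {name}{punctuation}"

    # Attach a hand-written Signature object; inspect.signature() returns it
    # as-is instead of introspecting greet()'s real parameters.
    greet.__signature__ = inspect.Signature(
        parameters=[inspect.Parameter("who", inspect.Parameter.POSITIONAL_OR_KEYWORD)]
    )

    print(inspect.signature(greet))   # prints "(who)", taken from __signature__

Setting ``__signature__`` is the shortcut the documentation describes: decorators and proxy objects can use it to advertise a signature that differs from what introspection of the underlying code object would report.
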
From bf41dcda703210c1590bca60032636d526487a95 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 23 Oct 2023 21:18:00 +0200 Subject: [PATCH 126/265] [3.11] gh-101100: Fix Sphinx warnings for `fileno` (GH-111118) (#111227) Co-authored-by: Hugo van Kemenade --- Doc/library/bz2.rst | 48 +++++++++++++++++++++++++++++---- Doc/library/mmap.rst | 2 +- Doc/library/multiprocessing.rst | 2 +- Doc/library/selectors.rst | 2 +- Doc/library/socket.rst | 2 +- Doc/library/tempfile.rst | 2 +- Doc/tools/.nitignore | 2 -- Doc/whatsnew/2.5.rst | 2 +- 8 files changed, 49 insertions(+), 13 deletions(-) diff --git a/Doc/library/bz2.rst b/Doc/library/bz2.rst index d03f2c4f0ca97a..51b1da89683ead 100644 --- a/Doc/library/bz2.rst +++ b/Doc/library/bz2.rst @@ -91,7 +91,7 @@ The :mod:`bz2` module contains: and :meth:`~io.IOBase.truncate`. Iteration and the :keyword:`with` statement are supported. - :class:`BZ2File` also provides the following method: + :class:`BZ2File` also provides the following methods: .. method:: peek([n]) @@ -106,14 +106,52 @@ The :mod:`bz2` module contains: .. versionadded:: 3.3 + .. method:: fileno() + + Return the file descriptor for the underlying file. + + .. versionadded:: 3.3 + + .. method:: readable() + + Return whether the file was opened for reading. + + .. versionadded:: 3.3 + + .. method:: seekable() + + Return whether the file supports seeking. + + .. versionadded:: 3.3 + + .. method:: writable() + + Return whether the file was opened for writing. + + .. versionadded:: 3.3 + + .. method:: read1(size=-1) + + Read up to *size* uncompressed bytes, while trying to avoid + making multiple reads from the underlying stream. Reads up to a + buffer's worth of data if size is negative. + + Returns ``b''`` if the file is at EOF. + + .. versionadded:: 3.3 + + .. method:: readinto(b) + + Read bytes into *b*. + + Returns the number of bytes read (0 for EOF). + + .. versionadded:: 3.3 + .. versionchanged:: 3.1 Support for the :keyword:`with` statement was added. - .. versionchanged:: 3.3 - The :meth:`fileno`, :meth:`readable`, :meth:`seekable`, :meth:`writable`, - :meth:`read1` and :meth:`readinto` methods were added. - .. versionchanged:: 3.3 Support was added for *filename* being a :term:`file object` instead of an actual filename. diff --git a/Doc/library/mmap.rst b/Doc/library/mmap.rst index c4f8781f2ac993..21fa97a2a3cef8 100644 --- a/Doc/library/mmap.rst +++ b/Doc/library/mmap.rst @@ -19,7 +19,7 @@ the current file position, and :meth:`seek` through the file to different positi A memory-mapped file is created by the :class:`~mmap.mmap` constructor, which is different on Unix and on Windows. In either case you must provide a file descriptor for a file opened for update. If you wish to map an existing Python -file object, use its :meth:`fileno` method to obtain the correct value for the +file object, use its :meth:`~io.IOBase.fileno` method to obtain the correct value for the *fileno* parameter. Otherwise, you can open the file using the :func:`os.open` function, which returns a file descriptor directly (the file still needs to be closed when done). diff --git a/Doc/library/multiprocessing.rst b/Doc/library/multiprocessing.rst index 2162328be9adb2..99dda5b5768fbd 100644 --- a/Doc/library/multiprocessing.rst +++ b/Doc/library/multiprocessing.rst @@ -2540,7 +2540,7 @@ multiple connections at the same time. 
**Windows**: An item in *object_list* must either be an integer handle which is waitable (according to the definition used by the documentation of the Win32 function ``WaitForMultipleObjects()``) - or it can be an object with a :meth:`fileno` method which returns a + or it can be an object with a :meth:`~io.IOBase.fileno` method which returns a socket handle or pipe handle. (Note that pipe handles and socket handles are **not** waitable handles.) diff --git a/Doc/library/selectors.rst b/Doc/library/selectors.rst index dd50bac37e49b8..76cbf91412f763 100644 --- a/Doc/library/selectors.rst +++ b/Doc/library/selectors.rst @@ -21,7 +21,7 @@ It defines a :class:`BaseSelector` abstract base class, along with several concrete implementations (:class:`KqueueSelector`, :class:`EpollSelector`...), that can be used to wait for I/O readiness notification on multiple file objects. In the following, "file object" refers to any object with a -:meth:`fileno()` method, or a raw file descriptor. See :term:`file object`. +:meth:`~io.IOBase.fileno` method, or a raw file descriptor. See :term:`file object`. :class:`DefaultSelector` is an alias to the most efficient implementation available on the current platform: this should be the default choice for most diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index c7274f59f43d1a..7193713984f169 100644 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -759,7 +759,7 @@ The following functions all create :ref:`socket objects `. .. function:: fromfd(fd, family, type, proto=0) Duplicate the file descriptor *fd* (an integer as returned by a file object's - :meth:`fileno` method) and build a socket object from the result. Address + :meth:`~io.IOBase.fileno` method) and build a socket object from the result. Address family, socket type and protocol number are as for the :func:`.socket` function above. The file descriptor should refer to a socket, but this is not checked --- subsequent operations on the object may fail if the file descriptor is invalid. diff --git a/Doc/library/tempfile.rst b/Doc/library/tempfile.rst index 84a6f8887954cc..54096f083fd98b 100644 --- a/Doc/library/tempfile.rst +++ b/Doc/library/tempfile.rst @@ -103,7 +103,7 @@ The module defines the following user-callable items: This class operates exactly as :func:`TemporaryFile` does, except that data is spooled in memory until the file size exceeds *max_size*, or - until the file's :func:`fileno` method is called, at which point the + until the file's :func:`~io.IOBase.fileno` method is called, at which point the contents are written to disk and operation proceeds as with :func:`TemporaryFile`. diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 8fd7bcab8b2dff..5dc8bdc61f6ce5 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -39,7 +39,6 @@ Doc/library/asyncio-subprocess.rst Doc/library/asyncio-task.rst Doc/library/bdb.rst Doc/library/bisect.rst -Doc/library/bz2.rst Doc/library/calendar.rst Doc/library/cmd.rst Doc/library/collections.abc.rst @@ -104,7 +103,6 @@ Doc/library/reprlib.rst Doc/library/resource.rst Doc/library/rlcompleter.rst Doc/library/select.rst -Doc/library/selectors.rst Doc/library/shelve.rst Doc/library/signal.rst Doc/library/smtplib.rst diff --git a/Doc/whatsnew/2.5.rst b/Doc/whatsnew/2.5.rst index ad0931ecbed060..3608153db073a6 100644 --- a/Doc/whatsnew/2.5.rst +++ b/Doc/whatsnew/2.5.rst @@ -1347,7 +1347,7 @@ complete list of changes, or look through the SVN logs for all the details. 
:func:`input` function to allow opening files in binary or :term:`universal newlines` mode. Another new parameter, *openhook*, lets you use a function other than :func:`open` to open the input files. Once you're iterating over - the set of files, the :class:`FileInput` object's new :meth:`fileno` returns + the set of files, the :class:`FileInput` object's new :meth:`~fileinput.fileno` returns the file descriptor for the currently opened file. (Contributed by Georg Brandl.) From e35393fde91fba1157b57df0c0e0501b72705085 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 24 Oct 2023 08:35:49 +0200 Subject: [PATCH 127/265] [3.11] Fix a code snippet typo in asyncio docs (GH-108427) (#111244) Co-authored-by: A <5249513+Dumeng@users.noreply.github.com> --- Doc/library/asyncio-task.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/asyncio-task.rst b/Doc/library/asyncio-task.rst index 0944aef0ab619d..858c2996b87695 100644 --- a/Doc/library/asyncio-task.rst +++ b/Doc/library/asyncio-task.rst @@ -539,7 +539,7 @@ Shielding From Cancellation is equivalent to:: - res = await something() + res = await shield(something()) *except* that if the coroutine containing it is cancelled, the Task running in ``something()`` is not cancelled. From the point From 10376a164f31409e8cedb140178d4670fb44f8c1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 24 Oct 2023 10:15:03 +0200 Subject: [PATCH 128/265] [3.11] Fix typo in sys docs (GH-111196) (#111249) Co-authored-by: James Tocknell Co-authored-by: Hugo van Kemenade --- Doc/library/sys.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst index 65c20748f862b5..064a9c47137b30 100644 --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -1714,7 +1714,7 @@ always available. However, if you are writing a library (and do not control in which context its code will be executed), be aware that the standard streams may be replaced with file-like objects like :class:`io.StringIO` which - do not support the :attr:!buffer` attribute. + do not support the :attr:`!buffer` attribute. .. 
data:: __stdin__ From c905fab33871df2c250f1e37760f6d541672a566 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 24 Oct 2023 13:53:38 +0200 Subject: [PATCH 129/265] [3.11] gh-75666: Tkinter: add tests for binding (GH-111202) (GH-111256) (cherry picked from commit 9bb202a1a90ef0edce20c495c9426d9766df11bb) Co-authored-by: Serhiy Storchaka --- Lib/tkinter/test/test_tkinter/test_misc.py | 307 +++++++++++++++++++++ 1 file changed, 307 insertions(+) diff --git a/Lib/tkinter/test/test_tkinter/test_misc.py b/Lib/tkinter/test/test_tkinter/test_misc.py index 620b6ed638c25a..c772bb804639b8 100644 --- a/Lib/tkinter/test/test_tkinter/test_misc.py +++ b/Lib/tkinter/test/test_tkinter/test_misc.py @@ -371,6 +371,309 @@ def test_info_patchlevel(self): self.assertTrue(str(vi).startswith(f'{vi.major}.{vi.minor}')) +class BindTest(AbstractTkTest, unittest.TestCase): + + def setUp(self): + super().setUp() + root = self.root + self.frame = tkinter.Frame(self.root, class_='Test', + width=150, height=100) + self.frame.pack() + + def assertCommandExist(self, funcid): + self.assertEqual(_info_commands(self.root, funcid), (funcid,)) + + def assertCommandNotExist(self, funcid): + self.assertEqual(_info_commands(self.root, funcid), ()) + + def test_bind(self): + event = '' + f = self.frame + self.assertEqual(f.bind(), ()) + self.assertEqual(f.bind(event), '') + def test1(e): pass + def test2(e): pass + + funcid = f.bind(event, test1) + self.assertEqual(f.bind(), (event,)) + script = f.bind(event) + self.assertIn(funcid, script) + self.assertCommandExist(funcid) + + funcid2 = f.bind(event, test2, add=True) + script = f.bind(event) + self.assertIn(funcid, script) + self.assertIn(funcid2, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + def test_unbind(self): + event = '' + f = self.frame + self.assertEqual(f.bind(), ()) + self.assertEqual(f.bind(event), '') + def test1(e): pass + def test2(e): pass + + funcid = f.bind(event, test1) + funcid2 = f.bind(event, test2, add=True) + + self.assertRaises(TypeError, f.unbind) + f.unbind(event) + self.assertEqual(f.bind(event), '') + self.assertEqual(f.bind(), ()) + + def test_unbind2(self): + f = self.frame + event = '' + self.assertEqual(f.bind(), ()) + self.assertEqual(f.bind(event), '') + def test1(e): pass + def test2(e): pass + + funcid = f.bind(event, test1) + funcid2 = f.bind(event, test2, add=True) + + f.unbind(event, funcid) + script = f.bind(event) + self.assertNotIn(funcid, script) + self.assertCommandNotExist(funcid) + self.assertCommandExist(funcid2) + + f.unbind(event, funcid2) + self.assertEqual(f.bind(event), '') + self.assertEqual(f.bind(), ()) + self.assertCommandNotExist(funcid) + self.assertCommandNotExist(funcid2) + + # non-idempotent + self.assertRaises(tkinter.TclError, f.unbind, event, funcid2) + + def test_bind_rebind(self): + event = '' + f = self.frame + self.assertEqual(f.bind(), ()) + self.assertEqual(f.bind(event), '') + def test1(e): pass + def test2(e): pass + def test3(e): pass + + funcid = f.bind(event, test1) + funcid2 = f.bind(event, test2, add=True) + script = f.bind(event) + self.assertIn(funcid2, script) + self.assertIn(funcid, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + funcid3 = f.bind(event, test3) + script = f.bind(event) + self.assertNotIn(funcid, script) + self.assertNotIn(funcid2, script) + self.assertIn(funcid3, script) + self.assertCommandExist(funcid3) + + def test_bind_class(self): + event = '' + 
bind_class = self.root.bind_class + unbind_class = self.root.unbind_class + self.assertRaises(TypeError, bind_class) + self.assertEqual(bind_class('Test'), ()) + self.assertEqual(bind_class('Test', event), '') + self.addCleanup(unbind_class, 'Test', event) + def test1(e): pass + def test2(e): pass + + funcid = bind_class('Test', event, test1) + self.assertEqual(bind_class('Test'), (event,)) + script = bind_class('Test', event) + self.assertIn(funcid, script) + self.assertCommandExist(funcid) + + funcid2 = bind_class('Test', event, test2, add=True) + script = bind_class('Test', event) + self.assertIn(funcid, script) + self.assertIn(funcid2, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + def test_unbind_class(self): + event = '' + bind_class = self.root.bind_class + unbind_class = self.root.unbind_class + self.assertEqual(bind_class('Test'), ()) + self.assertEqual(bind_class('Test', event), '') + self.addCleanup(unbind_class, 'Test', event) + def test1(e): pass + def test2(e): pass + + funcid = bind_class('Test', event, test1) + funcid2 = bind_class('Test', event, test2, add=True) + + self.assertRaises(TypeError, unbind_class) + self.assertRaises(TypeError, unbind_class, 'Test') + unbind_class('Test', event) + self.assertEqual(bind_class('Test', event), '') + self.assertEqual(bind_class('Test'), ()) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + unbind_class('Test', event) # idempotent + + def test_bind_class_rebind(self): + event = '' + bind_class = self.root.bind_class + unbind_class = self.root.unbind_class + self.assertEqual(bind_class('Test'), ()) + self.assertEqual(bind_class('Test', event), '') + self.addCleanup(unbind_class, 'Test', event) + def test1(e): pass + def test2(e): pass + def test3(e): pass + + funcid = bind_class('Test', event, test1) + funcid2 = bind_class('Test', event, test2, add=True) + script = bind_class('Test', event) + self.assertIn(funcid2, script) + self.assertIn(funcid, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + funcid3 = bind_class('Test', event, test3) + script = bind_class('Test', event) + self.assertNotIn(funcid, script) + self.assertNotIn(funcid2, script) + self.assertIn(funcid3, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + self.assertCommandExist(funcid3) + + def test_bind_all(self): + event = '' + bind_all = self.root.bind_all + unbind_all = self.root.unbind_all + self.assertNotIn(event, bind_all()) + self.assertEqual(bind_all(event), '') + self.addCleanup(unbind_all, event) + def test1(e): pass + def test2(e): pass + + funcid = bind_all(event, test1) + self.assertIn(event, bind_all()) + script = bind_all(event) + self.assertIn(funcid, script) + self.assertCommandExist(funcid) + + funcid2 = bind_all(event, test2, add=True) + script = bind_all(event) + self.assertIn(funcid, script) + self.assertIn(funcid2, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + def test_unbind_all(self): + event = '' + bind_all = self.root.bind_all + unbind_all = self.root.unbind_all + self.assertNotIn(event, bind_all()) + self.assertEqual(bind_all(event), '') + self.addCleanup(unbind_all, event) + def test1(e): pass + def test2(e): pass + + funcid = bind_all(event, test1) + funcid2 = bind_all(event, test2, add=True) + + unbind_all(event) + self.assertEqual(bind_all(event), '') + self.assertNotIn(event, bind_all()) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + unbind_all(event) # idempotent 
+ + def test_bind_all_rebind(self): + event = '' + bind_all = self.root.bind_all + unbind_all = self.root.unbind_all + self.assertNotIn(event, bind_all()) + self.assertEqual(bind_all(event), '') + self.addCleanup(unbind_all, event) + def test1(e): pass + def test2(e): pass + def test3(e): pass + + funcid = bind_all(event, test1) + funcid2 = bind_all(event, test2, add=True) + script = bind_all(event) + self.assertIn(funcid2, script) + self.assertIn(funcid, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + + funcid3 = bind_all(event, test3) + script = bind_all(event) + self.assertNotIn(funcid, script) + self.assertNotIn(funcid2, script) + self.assertIn(funcid3, script) + self.assertCommandExist(funcid) + self.assertCommandExist(funcid2) + self.assertCommandExist(funcid3) + + def test_bindtags(self): + f = self.frame + self.assertEqual(self.root.bindtags(), ('.', 'Tk', 'all')) + self.assertEqual(f.bindtags(), (str(f), 'Test', '.', 'all')) + f.bindtags(('a', 'b c')) + self.assertEqual(f.bindtags(), ('a', 'b c')) + + def test_bind_events(self): + event = '' + root = self.root + t = tkinter.Toplevel(root) + f = tkinter.Frame(t, class_='Test', width=150, height=100) + f.pack() + root.wait_visibility() # needed on Windows + root.update_idletasks() + self.addCleanup(root.unbind_class, 'Test', event) + self.addCleanup(root.unbind_class, 'Toplevel', event) + self.addCleanup(root.unbind_class, 'tag', event) + self.addCleanup(root.unbind_class, 'tag2', event) + self.addCleanup(root.unbind_all, event) + def test(what): + return lambda e: events.append((what, e.widget)) + + root.bind_all(event, test('all')) + root.bind_class('Test', event, test('frame class')) + root.bind_class('Toplevel', event, test('toplevel class')) + root.bind_class('tag', event, test('tag')) + root.bind_class('tag2', event, test('tag2')) + f.bind(event, test('frame')) + t.bind(event, test('toplevel')) + + events = [] + f.event_generate(event) + self.assertEqual(events, [ + ('frame', f), + ('frame class', f), + ('toplevel', f), + ('all', f), + ]) + + events = [] + t.event_generate(event) + self.assertEqual(events, [ + ('toplevel', t), + ('toplevel class', t), + ('all', t), + ]) + + f.bindtags(('tag', 'tag3')) + events = [] + f.event_generate(event) + self.assertEqual(events, [('tag', f)]) + + class DefaultRootTest(AbstractDefaultRootTest, unittest.TestCase): def test_default_root(self): @@ -426,5 +729,9 @@ def test_mainloop(self): self.assertRaises(RuntimeError, tkinter.mainloop) +def _info_commands(widget, pattern=None): + return widget.tk.splitlist(widget.tk.call('info', 'commands', pattern)) + + if __name__ == "__main__": unittest.main() From 575bff3732130e7470b0d005f36c904f8f7db92d Mon Sep 17 00:00:00 2001 From: Jelle Zijlstra Date: Tue, 24 Oct 2023 08:34:46 -0700 Subject: [PATCH 130/265] [3.11] gh-111151: Convert monospaced directives to :ref: (GH-111152) (#111270) (cherry picked from commit 1198076447f35b19a9173866ccb9839f3bcf3f17) Co-authored-by: InSync <122007197+InSyncWithFoo@users.noreply.github.com> --- Doc/library/asyncio-eventloop.rst | 6 ++++++ Doc/library/asyncio.rst | 6 +++--- Doc/library/typing.rst | 14 ++++++++++---- 3 files changed, 19 insertions(+), 7 deletions(-) diff --git a/Doc/library/asyncio-eventloop.rst b/Doc/library/asyncio-eventloop.rst index aeebc3bae85e13..5e40b50056d65a 100644 --- a/Doc/library/asyncio-eventloop.rst +++ b/Doc/library/asyncio-eventloop.rst @@ -644,6 +644,8 @@ Opening network connections Creating network servers ^^^^^^^^^^^^^^^^^^^^^^^^ +.. 
_loop_create_server: + .. coroutinemethod:: loop.create_server(protocol_factory, \ host=None, port=None, *, \ family=socket.AF_UNSPEC, \ @@ -1174,6 +1176,8 @@ Working with pipes Unix signals ^^^^^^^^^^^^ +.. _loop_add_signal_handler: + .. method:: loop.add_signal_handler(signum, callback, *args) Set *callback* as the handler for the *signum* signal. @@ -1393,6 +1397,8 @@ async/await code consider using the high-level :ref:`Subprocess Support on Windows ` for details. +.. _loop_subprocess_exec: + .. coroutinemethod:: loop.subprocess_exec(protocol_factory, *args, \ stdin=subprocess.PIPE, stdout=subprocess.PIPE, \ stderr=subprocess.PIPE, **kwargs) diff --git a/Doc/library/asyncio.rst b/Doc/library/asyncio.rst index c75ab47404c1e4..5f33c6813e74c0 100644 --- a/Doc/library/asyncio.rst +++ b/Doc/library/asyncio.rst @@ -46,9 +46,9 @@ Additionally, there are **low-level** APIs for *library and framework developers* to: * create and manage :ref:`event loops `, which - provide asynchronous APIs for :meth:`networking `, - running :meth:`subprocesses `, - handling :meth:`OS signals `, etc; + provide asynchronous APIs for :ref:`networking `, + running :ref:`subprocesses `, + handling :ref:`OS signals `, etc; * implement efficient protocols using :ref:`transports `; diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 57384b05fe23c9..0decbaf39749c0 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -290,7 +290,7 @@ a callable with any arbitrary parameter list would be acceptable: x = concat # Also OK ``Callable`` cannot express complex signatures such as functions that take a -variadic number of arguments, :func:`overloaded functions `, or +variadic number of arguments, :ref:`overloaded functions `, or functions that have keyword-only parameters. However, these signatures can be expressed by defining a :class:`Protocol` class with a :meth:`~object.__call__` method: @@ -1424,7 +1424,7 @@ These can be used as types in annotations. They all support subscription using Typing operator to conceptually mark an object as having been unpacked. For example, using the unpack operator ``*`` on a - :class:`type variable tuple ` is equivalent to using ``Unpack`` + :ref:`type variable tuple ` is equivalent to using ``Unpack`` to mark the type variable tuple as having been unpacked:: Ts = TypeVarTuple('Ts') @@ -1479,6 +1479,8 @@ for creating generic types. except KeyError: return default +.. _typevar: + .. class:: TypeVar(name, *constraints, bound=None, covariant=False, contravariant=False) Type variable. @@ -1573,9 +1575,11 @@ for creating generic types. A tuple containing the constraints of the type variable, if any. +.. _typevartuple: + .. class:: TypeVarTuple(name) - Type variable tuple. A specialized form of :class:`type variable ` + Type variable tuple. A specialized form of :ref:`type variable ` that enables *variadic* generics. Usage:: @@ -1686,7 +1690,7 @@ for creating generic types. .. class:: ParamSpec(name, *, bound=None, covariant=False, contravariant=False) Parameter specification variable. A specialized version of - :class:`type variables `. + :ref:`type variables `. Usage:: @@ -2483,6 +2487,8 @@ Functions and decorators .. versionadded:: 3.11 +.. _overload: + .. decorator:: overload Decorator for creating overloaded functions and methods. 
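
The conversions in this patch follow one pattern: add an explicit
reStructuredText label directly above a section or directive, then have other
prose point at that label through a ``:ref:`` role. A minimal sketch of the
pattern, reusing the ``_typevar`` label added above (the link target written
inside the ``:ref:`` role below is an assumption inferred from that label):

    .. _typevar:

    .. class:: TypeVar(name, *constraints, bound=None, covariant=False, contravariant=False)

       Type variable.

    .. elsewhere in the documentation, the label can then be cross-referenced
       by name, with explicit link text:

    A specialized version of :ref:`type variables <typevar>`.

The asyncio changes pair up the same way: ``.. _loop_create_server:``,
``.. _loop_add_signal_handler:`` and ``.. _loop_subprocess_exec:`` are the
labels that the reworked ``:ref:`` roles in Doc/library/asyncio.rst are
presumably meant to target.
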
From ff7dc61643ce1640befdff451b4a05e0eae02c31 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 24 Oct 2023 18:26:57 +0200 Subject: [PATCH 131/265] [3.11] Revert "Fix a code snippet typo in asyncio docs (GH-108427)" (GH-111271) (GH-111273) Revert "Fix a code snippet typo in asyncio docs (GH-108427)" (GH-111271) This reverts commit 7f316763402a7d5556deecc3acd06cb719e189b3. The change resulted in a tautology and should not have been made. There may be an opportunity for additional clarity in this section, but this change wasn't it :) (cherry picked from commit c7d68f907ad3e3aa17546df92a32bddb145a69bf) Ref: https://github.com/python/cpython/pull/108427#-issuecomment-1777525740 Co-authored-by: Zachary Ware --- Doc/library/asyncio-task.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/asyncio-task.rst b/Doc/library/asyncio-task.rst index 858c2996b87695..0944aef0ab619d 100644 --- a/Doc/library/asyncio-task.rst +++ b/Doc/library/asyncio-task.rst @@ -539,7 +539,7 @@ Shielding From Cancellation is equivalent to:: - res = await shield(something()) + res = await something() *except* that if the coroutine containing it is cancelled, the Task running in ``something()`` is not cancelled. From the point From 652d3f33fb7c9f8cf5fe20bb27230afac8a0d5aa Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 24 Oct 2023 22:56:02 +0200 Subject: [PATCH 132/265] [3.11] gh-109017: Use non alternate name for Kyiv (GH-109251) (GH-111279) tzdata provides Kiev as an alternative to Kyiv: https://sources.debian.org/src/tzdata/2023c-10/backward/?hl=314GH-L314 But Debian moved it to the tzdata-legacy package breaking the test: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=1050530 This patch switches to the name provided by tzdata. Also check that the new name is actually available. 
(cherry picked from commit 46407fe79ca78051cbf6c80e8b8e70a228f9fa50) Co-authored-by: Jochen Sprickerhof --- Lib/test/test_email/test_utils.py | 11 ++++------- 1 file changed, 4 insertions(+), 7 deletions(-) diff --git a/Lib/test/test_email/test_utils.py b/Lib/test/test_email/test_utils.py index 78afb358035e81..81109a7508ac0d 100644 --- a/Lib/test/test_email/test_utils.py +++ b/Lib/test/test_email/test_utils.py @@ -5,6 +5,7 @@ import unittest import sys import os.path +import zoneinfo class DateTimeTests(unittest.TestCase): @@ -142,13 +143,9 @@ def test_localtime_epoch_notz_daylight_false(self): t2 = utils.localtime(t0.replace(tzinfo=None)) self.assertEqual(t1, t2) - # XXX: Need a more robust test for Olson's tzdata - @unittest.skipIf(sys.platform.startswith('win'), - "Windows does not use Olson's TZ database") - @unittest.skipUnless(os.path.exists('/usr/share/zoneinfo') or - os.path.exists('/usr/lib/zoneinfo'), - "Can't find the Olson's TZ database") - @test.support.run_with_tz('Europe/Kiev') + @unittest.skipUnless("Europe/Kyiv" in zoneinfo.available_timezones(), + "Can't find a Kyiv timezone database") + @test.support.run_with_tz('Europe/Kyiv') def test_variable_tzname(self): t0 = datetime.datetime(1984, 1, 1, tzinfo=datetime.timezone.utc) t1 = utils.localtime(t0) From d38843a7b6825530a6d705132aabfb5325c498d4 Mon Sep 17 00:00:00 2001 From: Artyom Romanov <92092049+art3xa@users.noreply.github.com> Date: Wed, 25 Oct 2023 15:33:09 +0500 Subject: [PATCH 133/265] [3.11] Bump `ruff` to the latest version (GH-111288) (#111310) --- .pre-commit-config.yaml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index fd2692e490bec3..e80c5075401a99 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -1,6 +1,6 @@ repos: - repo: https://github.com/astral-sh/ruff-pre-commit - rev: v0.1.0 + rev: v0.1.2 hooks: - id: ruff name: Run Ruff on Lib/test/ From bb92fdabc7ab9d77453452d5e61f8cc4ca167616 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 25 Oct 2023 13:18:30 +0200 Subject: [PATCH 134/265] [3.11] gh-111174: Fix crash in getbuffer() called repeatedly for empty BytesIO (GH-111210) (GH-111315) (cherry picked from commit 9da98c0d9a7cc55c67fb0bd3fa162fd3b2c2629b) Co-authored-by: Serhiy Storchaka --- Lib/test/test_memoryio.py | 14 ++++++++++++++ .../2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst | 2 ++ Modules/_io/bytesio.c | 7 ++++--- 3 files changed, 20 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst diff --git a/Lib/test/test_memoryio.py b/Lib/test/test_memoryio.py index cd2faba1791c77..731299294e6877 100644 --- a/Lib/test/test_memoryio.py +++ b/Lib/test/test_memoryio.py @@ -463,6 +463,20 @@ def test_getbuffer(self): memio.close() self.assertRaises(ValueError, memio.getbuffer) + def test_getbuffer_empty(self): + memio = self.ioclass() + buf = memio.getbuffer() + self.assertEqual(bytes(buf), b"") + # Trying to change the size of the BytesIO while a buffer is exported + # raises a BufferError. 
+ self.assertRaises(BufferError, memio.write, b'x') + buf2 = memio.getbuffer() + self.assertRaises(BufferError, memio.write, b'x') + buf.release() + self.assertRaises(BufferError, memio.write, b'x') + buf2.release() + memio.write(b'x') + def test_read1(self): buf = self.buftype("1234567890") self.assertEqual(self.ioclass(buf).read1(), buf) diff --git a/Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst b/Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst new file mode 100644 index 00000000000000..95c315404d0ee6 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst @@ -0,0 +1,2 @@ +Fix crash in :meth:`io.BytesIO.getbuffer` called repeatedly for empty +BytesIO. diff --git a/Modules/_io/bytesio.c b/Modules/_io/bytesio.c index 930ef7e29dbcf6..e84eef4c75961e 100644 --- a/Modules/_io/bytesio.c +++ b/Modules/_io/bytesio.c @@ -124,12 +124,13 @@ unshare_buffer(bytesio *self, size_t size) static int resize_buffer(bytesio *self, size_t size) { + assert(self->buf != NULL); + assert(self->exports == 0); + /* Here, unsigned types are used to avoid dealing with signed integer overflow, which is undefined in C. */ size_t alloc = PyBytes_GET_SIZE(self->buf); - assert(self->buf != NULL); - /* For simplicity, stay in the range of the signed type. Anyway, Python doesn't allow strings to be longer than this. */ if (size > PY_SSIZE_T_MAX) @@ -1083,7 +1084,7 @@ bytesiobuf_getbuffer(bytesiobuf *obj, Py_buffer *view, int flags) "bytesiobuf_getbuffer: view==NULL argument is obsolete"); return -1; } - if (SHARED_BUF(b)) { + if (b->exports == 0 && SHARED_BUF(b)) { if (unshare_buffer(b, b->string_size) < 0) return -1; } From 14167031eb035afddd82a114ec46487f0e142c65 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Wed, 25 Oct 2023 14:57:17 +0300 Subject: [PATCH 135/265] [3.11] gh-111309: Use unittest to collect and run distutils tests (GH-111311) * use unittest.main() instead of run_unittest(test_suite()) to run tests from modules via the CLI * add explicit load_tests() to load doctests * use test.support.load_package_tests() to load tests in submodules of distutils.tests * removes no longer needed test_suite() functions --- Lib/distutils/tests/__init__.py | 36 +++++++------------ Lib/distutils/tests/test_archive_util.py | 7 ++-- Lib/distutils/tests/test_bdist.py | 7 +--- Lib/distutils/tests/test_bdist_dumb.py | 6 +--- Lib/distutils/tests/test_bdist_rpm.py | 7 ++-- Lib/distutils/tests/test_build.py | 6 +--- Lib/distutils/tests/test_build_clib.py | 7 ++-- Lib/distutils/tests/test_build_ext.py | 8 +---- Lib/distutils/tests/test_build_py.py | 7 ++-- Lib/distutils/tests/test_build_scripts.py | 6 +--- Lib/distutils/tests/test_check.py | 6 +--- Lib/distutils/tests/test_clean.py | 6 +--- Lib/distutils/tests/test_cmd.py | 7 ++-- Lib/distutils/tests/test_config.py | 6 +--- Lib/distutils/tests/test_config_cmd.py | 7 ++-- Lib/distutils/tests/test_core.py | 7 ++-- Lib/distutils/tests/test_cygwinccompiler.py | 6 +--- Lib/distutils/tests/test_dep_util.py | 6 +--- Lib/distutils/tests/test_dir_util.py | 7 ++-- Lib/distutils/tests/test_dist.py | 10 ++---- Lib/distutils/tests/test_extension.py | 6 +--- Lib/distutils/tests/test_file_util.py | 6 +--- Lib/distutils/tests/test_filelist.py | 11 ++---- Lib/distutils/tests/test_install.py | 7 ++-- Lib/distutils/tests/test_install_data.py | 6 +--- Lib/distutils/tests/test_install_headers.py | 6 +--- Lib/distutils/tests/test_install_lib.py | 7 ++-- Lib/distutils/tests/test_install_scripts.py | 6 +--- 
Lib/distutils/tests/test_log.py | 7 ++-- Lib/distutils/tests/test_msvc9compiler.py | 6 +--- Lib/distutils/tests/test_msvccompiler.py | 6 +--- Lib/distutils/tests/test_register.py | 6 +--- Lib/distutils/tests/test_sdist.py | 7 ++-- Lib/distutils/tests/test_spawn.py | 7 ++-- Lib/distutils/tests/test_sysconfig.py | 10 ++---- Lib/distutils/tests/test_text_file.py | 6 +--- Lib/distutils/tests/test_unixccompiler.py | 6 +--- Lib/distutils/tests/test_upload.py | 6 +--- Lib/distutils/tests/test_util.py | 6 +--- Lib/distutils/tests/test_version.py | 6 +--- Lib/distutils/tests/test_versionpredicate.py | 8 ++--- Lib/test/test_distutils.py | 12 ++----- ...-10-25-13-13-30.gh-issue-111309.Re7orL.rst | 1 + 43 files changed, 75 insertions(+), 245 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst diff --git a/Lib/distutils/tests/__init__.py b/Lib/distutils/tests/__init__.py index 16d011fd9ee6e7..d7922ffa011393 100644 --- a/Lib/distutils/tests/__init__.py +++ b/Lib/distutils/tests/__init__.py @@ -1,9 +1,7 @@ """Test suite for distutils. This test suite consists of a collection of test modules in the -distutils.tests package. Each test module has a name starting with -'test' and contains a function test_suite(). The function is expected -to return an initialized unittest.TestSuite instance. +distutils.tests package. Tests for the command classes in the distutils.command package are included in distutils.tests as well, instead of using a separate @@ -13,29 +11,21 @@ """ import os -import sys import unittest -from test.support import run_unittest from test.support.warnings_helper import save_restore_warnings_filters +from test.support import warnings_helper +from test.support import load_package_tests -here = os.path.dirname(__file__) or os.curdir - - -def test_suite(): - suite = unittest.TestSuite() - for fn in os.listdir(here): - if fn.startswith("test") and fn.endswith(".py"): - modname = "distutils.tests." + fn[:-3] - # bpo-40055: Save/restore warnings filters to leave them unchanged. - # Importing tests imports docutils which imports pkg_resources - # which adds a warnings filter. - with save_restore_warnings_filters(): - __import__(modname) - module = sys.modules[modname] - suite.addTest(module.test_suite()) - return suite - +def load_tests(*args): + # bpo-40055: Save/restore warnings filters to leave them unchanged. + # Importing tests imports docutils which imports pkg_resources + # which adds a warnings filter. 
+ with (save_restore_warnings_filters(), + warnings_helper.check_warnings( + ("The distutils.sysconfig module is deprecated", DeprecationWarning), + quiet=True)): + return load_package_tests(os.path.dirname(__file__), *args) if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_archive_util.py b/Lib/distutils/tests/test_archive_util.py index 8aec84078ed48f..66aee1b44b2032 100644 --- a/Lib/distutils/tests/test_archive_util.py +++ b/Lib/distutils/tests/test_archive_util.py @@ -13,7 +13,7 @@ ARCHIVE_FORMATS) from distutils.spawn import find_executable, spawn from distutils.tests import support -from test.support import run_unittest, patch +from test.support import patch from test.support.os_helper import change_cwd from test.support.warnings_helper import check_warnings @@ -389,8 +389,5 @@ def test_tarfile_root_owner(self): finally: archive.close() -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(ArchiveUtilTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_bdist.py b/Lib/distutils/tests/test_bdist.py index 241fc9ad75f34b..c53f0ccb17239b 100644 --- a/Lib/distutils/tests/test_bdist.py +++ b/Lib/distutils/tests/test_bdist.py @@ -1,7 +1,6 @@ """Tests for distutils.command.bdist.""" import os import unittest -from test.support import run_unittest import warnings with warnings.catch_warnings(): @@ -44,9 +43,5 @@ def test_skip_build(self): '%s should take --skip-build from bdist' % name) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildTestCase) - - if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_bdist_dumb.py b/Lib/distutils/tests/test_bdist_dumb.py index bb860c8ac70345..b41812bccf692a 100644 --- a/Lib/distutils/tests/test_bdist_dumb.py +++ b/Lib/distutils/tests/test_bdist_dumb.py @@ -4,7 +4,6 @@ import sys import zipfile import unittest -from test.support import run_unittest from distutils.core import Distribution from distutils.command.bdist_dumb import bdist_dumb @@ -90,8 +89,5 @@ def test_simple_built(self): wanted.append('foo.%s.pyc' % sys.implementation.cache_tag) self.assertEqual(contents, sorted(wanted)) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildDumbTestCase) - if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_bdist_rpm.py b/Lib/distutils/tests/test_bdist_rpm.py index 7eefa7b9cad84f..ea9a0bc33623e8 100644 --- a/Lib/distutils/tests/test_bdist_rpm.py +++ b/Lib/distutils/tests/test_bdist_rpm.py @@ -3,7 +3,7 @@ import unittest import sys import os -from test.support import run_unittest, requires_zlib +from test.support import requires_zlib from distutils.core import Distribution from distutils.command.bdist_rpm import bdist_rpm @@ -134,8 +134,5 @@ def test_no_optimize_flag(self): os.remove(os.path.join(pkg_dir, 'dist', 'foo-0.1-1.noarch.rpm')) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildRpmTestCase) - if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_build.py b/Lib/distutils/tests/test_build.py index 71b5e164bae14a..c7c564d04774db 100644 --- a/Lib/distutils/tests/test_build.py +++ b/Lib/distutils/tests/test_build.py @@ -2,7 +2,6 @@ import unittest import os import sys -from test.support import run_unittest from distutils.command.build import build from 
distutils.tests import support @@ -50,8 +49,5 @@ def test_finalize_options(self): # executable is os.path.normpath(sys.executable) self.assertEqual(cmd.executable, os.path.normpath(sys.executable)) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_build_clib.py b/Lib/distutils/tests/test_build_clib.py index 95f928288e0048..0c3bc49e8e9760 100644 --- a/Lib/distutils/tests/test_build_clib.py +++ b/Lib/distutils/tests/test_build_clib.py @@ -5,7 +5,7 @@ import sysconfig from test.support import ( - run_unittest, missing_compiler_executable, requires_subprocess + missing_compiler_executable, requires_subprocess ) from distutils.command.build_clib import build_clib @@ -140,8 +140,5 @@ def test_run(self): # let's check the result self.assertIn('libfoo.a', os.listdir(build_temp)) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildCLibTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_build_ext.py b/Lib/distutils/tests/test_build_ext.py index 4ebeafecef03c3..e89dc50665be47 100644 --- a/Lib/distutils/tests/test_build_ext.py +++ b/Lib/distutils/tests/test_build_ext.py @@ -545,11 +545,5 @@ def build_ext(self, *args, **kwargs): return build_ext -def test_suite(): - suite = unittest.TestSuite() - suite.addTest(unittest.TestLoader().loadTestsFromTestCase(BuildExtTestCase)) - suite.addTest(unittest.TestLoader().loadTestsFromTestCase(ParallelBuildExtTestCase)) - return suite - if __name__ == '__main__': - support.run_unittest(__name__) + unittest.main() diff --git a/Lib/distutils/tests/test_build_py.py b/Lib/distutils/tests/test_build_py.py index 44a06cc963aa3c..a7035e5a3fd9dd 100644 --- a/Lib/distutils/tests/test_build_py.py +++ b/Lib/distutils/tests/test_build_py.py @@ -9,7 +9,7 @@ from distutils.errors import DistutilsFileError from distutils.tests import support -from test.support import run_unittest, requires_subprocess +from test.support import requires_subprocess class BuildPyTestCase(support.TempdirManager, @@ -174,8 +174,5 @@ def test_dont_write_bytecode(self): self.logs[0][1] % self.logs[0][2]) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildPyTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_build_scripts.py b/Lib/distutils/tests/test_build_scripts.py index f299e51ef79fac..485f4f050a5dba 100644 --- a/Lib/distutils/tests/test_build_scripts.py +++ b/Lib/distutils/tests/test_build_scripts.py @@ -8,7 +8,6 @@ from distutils import sysconfig from distutils.tests import support -from test.support import run_unittest class BuildScriptsTestCase(support.TempdirManager, @@ -105,8 +104,5 @@ def test_version_int(self): for name in expected: self.assertIn(name, built) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(BuildScriptsTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_check.py b/Lib/distutils/tests/test_check.py index 91bcdceb43bc69..1e86b948d22449 100644 --- a/Lib/distutils/tests/test_check.py +++ b/Lib/distutils/tests/test_check.py @@ -2,7 +2,6 @@ import os import textwrap import unittest -from test.support import run_unittest from distutils.command.check import check, HAS_DOCUTILS from distutils.tests import support @@ -156,8 +155,5 @@ def 
test_check_all(self): {}, **{'strict': 1, 'restructuredtext': 1}) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(CheckTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_clean.py b/Lib/distutils/tests/test_clean.py index 92367499cefc04..ccbb01e167e1cd 100644 --- a/Lib/distutils/tests/test_clean.py +++ b/Lib/distutils/tests/test_clean.py @@ -4,7 +4,6 @@ from distutils.command.clean import clean from distutils.tests import support -from test.support import run_unittest class cleanTestCase(support.TempdirManager, support.LoggingSilencer, @@ -42,8 +41,5 @@ def test_simple_run(self): cmd.ensure_finalized() cmd.run() -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(cleanTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_cmd.py b/Lib/distutils/tests/test_cmd.py index 2319214a9e332b..90310078b1c29a 100644 --- a/Lib/distutils/tests/test_cmd.py +++ b/Lib/distutils/tests/test_cmd.py @@ -1,7 +1,7 @@ """Tests for distutils.cmd.""" import unittest import os -from test.support import captured_stdout, run_unittest +from test.support import captured_stdout from distutils.cmd import Command from distutils.dist import Distribution @@ -119,8 +119,5 @@ def test_debug_print(self): finally: debug.DEBUG = False -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(CommandTestCase) - if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_config.py b/Lib/distutils/tests/test_config.py index 8ab70efb161cbd..cb8e7237abd9e1 100644 --- a/Lib/distutils/tests/test_config.py +++ b/Lib/distutils/tests/test_config.py @@ -8,7 +8,6 @@ from distutils.log import WARN from distutils.tests import support -from test.support import run_unittest PYPIRC = """\ [distutils] @@ -134,8 +133,5 @@ def test_config_interpolation(self): self.assertEqual(config, waited) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(PyPIRCCommandTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_config_cmd.py b/Lib/distutils/tests/test_config_cmd.py index c79db68aae115d..4f132ce3d185cc 100644 --- a/Lib/distutils/tests/test_config_cmd.py +++ b/Lib/distutils/tests/test_config_cmd.py @@ -4,7 +4,7 @@ import sys import sysconfig from test.support import ( - run_unittest, missing_compiler_executable, requires_subprocess + missing_compiler_executable, requires_subprocess ) from distutils.command.config import dump_file, config @@ -96,8 +96,5 @@ def test_clean(self): for f in (f1, f2): self.assertFalse(os.path.exists(f)) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(ConfigTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_core.py b/Lib/distutils/tests/test_core.py index 700a22da045d45..59fb16bdec28cb 100644 --- a/Lib/distutils/tests/test_core.py +++ b/Lib/distutils/tests/test_core.py @@ -5,7 +5,7 @@ import os import shutil import sys -from test.support import captured_stdout, run_unittest +from test.support import captured_stdout from test.support import os_helper import unittest from distutils.tests import support @@ -133,8 +133,5 @@ def test_debug_mode(self): wanted = "options (after parsing config files):\n" self.assertEqual(stdout.readlines()[0], wanted) -def test_suite(): - return 
unittest.TestLoader().loadTestsFromTestCase(CoreTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_cygwinccompiler.py b/Lib/distutils/tests/test_cygwinccompiler.py index 0912ffd15c8ee9..633d3041a1bee4 100644 --- a/Lib/distutils/tests/test_cygwinccompiler.py +++ b/Lib/distutils/tests/test_cygwinccompiler.py @@ -3,7 +3,6 @@ import sys import os from io import BytesIO -from test.support import run_unittest from distutils import cygwinccompiler from distutils.cygwinccompiler import (check_config_h, @@ -147,8 +146,5 @@ def test_get_msvcr(self): '[MSC v.1999 32 bits (Intel)]') self.assertRaises(ValueError, get_msvcr) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(CygwinCCompilerTestCase) - if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_dep_util.py b/Lib/distutils/tests/test_dep_util.py index 0d52740a9edda3..ef52900b91af37 100644 --- a/Lib/distutils/tests/test_dep_util.py +++ b/Lib/distutils/tests/test_dep_util.py @@ -5,7 +5,6 @@ from distutils.dep_util import newer, newer_pairwise, newer_group from distutils.errors import DistutilsFileError from distutils.tests import support -from test.support import run_unittest class DepUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -73,8 +72,5 @@ def test_newer_group(self): missing='newer')) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(DepUtilTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_dir_util.py b/Lib/distutils/tests/test_dir_util.py index ebd89f320dca89..bae9c3ed56c830 100644 --- a/Lib/distutils/tests/test_dir_util.py +++ b/Lib/distutils/tests/test_dir_util.py @@ -11,7 +11,7 @@ from distutils import log from distutils.tests import support -from test.support import run_unittest, is_emscripten, is_wasi +from test.support import is_emscripten, is_wasi class DirUtilTestCase(support.TempdirManager, unittest.TestCase): @@ -136,8 +136,5 @@ def test_copy_tree_exception_in_listdir(self): dir_util.copy_tree(src, None) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(DirUtilTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_dist.py b/Lib/distutils/tests/test_dist.py index 2ef70d987f36bb..8ab24516e7073a 100644 --- a/Lib/distutils/tests/test_dist.py +++ b/Lib/distutils/tests/test_dist.py @@ -12,7 +12,7 @@ from distutils.cmd import Command from test.support import ( - captured_stdout, captured_stderr, run_unittest + captured_stdout, captured_stderr ) from test.support.os_helper import TESTFN from distutils.tests import support @@ -519,11 +519,5 @@ def test_read_metadata(self): self.assertEqual(metadata.obsoletes, None) self.assertEqual(metadata.requires, ['foo']) -def test_suite(): - suite = unittest.TestSuite() - suite.addTest(unittest.TestLoader().loadTestsFromTestCase(DistributionTestCase)) - suite.addTest(unittest.TestLoader().loadTestsFromTestCase(MetadataTestCase)) - return suite - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_extension.py b/Lib/distutils/tests/test_extension.py index 2b08930eafb10a..f9cdef25416549 100644 --- a/Lib/distutils/tests/test_extension.py +++ b/Lib/distutils/tests/test_extension.py @@ -3,7 +3,6 @@ import os import warnings -from test.support import run_unittest from test.support.warnings_helper 
import check_warnings from distutils.extension import read_setup_file, Extension @@ -63,8 +62,5 @@ def test_extension_init(self): self.assertEqual(str(w.warnings[0].message), "Unknown Extension options: 'chic'") -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(ExtensionTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_file_util.py b/Lib/distutils/tests/test_file_util.py index 551151b0143661..76ef6e9743d381 100644 --- a/Lib/distutils/tests/test_file_util.py +++ b/Lib/distutils/tests/test_file_util.py @@ -8,7 +8,6 @@ from distutils import log from distutils.tests import support from distutils.errors import DistutilsFileError -from test.support import run_unittest from test.support.os_helper import unlink @@ -119,8 +118,5 @@ def test_copy_file_hard_link_failure(self): self.assertEqual(f.read(), 'some content') -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(FileUtilTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_filelist.py b/Lib/distutils/tests/test_filelist.py index 98c97e49f80db5..216cf27f493e02 100644 --- a/Lib/distutils/tests/test_filelist.py +++ b/Lib/distutils/tests/test_filelist.py @@ -9,7 +9,7 @@ from distutils import filelist from test.support import os_helper -from test.support import captured_stdout, run_unittest +from test.support import captured_stdout from distutils.tests import support MANIFEST_IN = """\ @@ -329,12 +329,5 @@ def test_non_local_discovery(self): self.assertEqual(filelist.findall(temp_dir), expected) -def test_suite(): - return unittest.TestSuite([ - unittest.TestLoader().loadTestsFromTestCase(FileListTestCase), - unittest.TestLoader().loadTestsFromTestCase(FindAllTestCase), - ]) - - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_install.py b/Lib/distutils/tests/test_install.py index c38f98b8b2c294..c30414d5cb24fa 100644 --- a/Lib/distutils/tests/test_install.py +++ b/Lib/distutils/tests/test_install.py @@ -5,7 +5,7 @@ import unittest import site -from test.support import captured_stdout, run_unittest, requires_subprocess +from test.support import captured_stdout, requires_subprocess from distutils import sysconfig from distutils.command.install import install, HAS_USER_SITE @@ -254,8 +254,5 @@ def test_debug_mode(self): self.assertGreater(len(self.logs), old_logs_len) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(InstallTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_install_data.py b/Lib/distutils/tests/test_install_data.py index 6191d2fa6eefab..c5c04a36e12401 100644 --- a/Lib/distutils/tests/test_install_data.py +++ b/Lib/distutils/tests/test_install_data.py @@ -4,7 +4,6 @@ from distutils.command.install_data import install_data from distutils.tests import support -from test.support import run_unittest class InstallDataTestCase(support.TempdirManager, support.LoggingSilencer, @@ -68,8 +67,5 @@ def test_simple_run(self): self.assertTrue(os.path.exists(os.path.join(inst2, rtwo))) self.assertTrue(os.path.exists(os.path.join(inst, rone))) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(InstallDataTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_install_headers.py b/Lib/distutils/tests/test_install_headers.py 
index 1aa4d09cdef731..f8f513217fc6a9 100644 --- a/Lib/distutils/tests/test_install_headers.py +++ b/Lib/distutils/tests/test_install_headers.py @@ -4,7 +4,6 @@ from distutils.command.install_headers import install_headers from distutils.tests import support -from test.support import run_unittest class InstallHeadersTestCase(support.TempdirManager, support.LoggingSilencer, @@ -32,8 +31,5 @@ def test_simple_run(self): # let's check the results self.assertEqual(len(cmd.get_outputs()), 2) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(InstallHeadersTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_install_lib.py b/Lib/distutils/tests/test_install_lib.py index f840d1a94665ed..08bc9c84269376 100644 --- a/Lib/distutils/tests/test_install_lib.py +++ b/Lib/distutils/tests/test_install_lib.py @@ -8,7 +8,7 @@ from distutils.extension import Extension from distutils.tests import support from distutils.errors import DistutilsOptionError -from test.support import run_unittest, requires_subprocess +from test.support import requires_subprocess class InstallLibTestCase(support.TempdirManager, @@ -110,8 +110,5 @@ def test_dont_write_bytecode(self): self.logs[0][1] % self.logs[0][2]) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(InstallLibTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_install_scripts.py b/Lib/distutils/tests/test_install_scripts.py index 648db3b11da745..b4272d6f90faea 100644 --- a/Lib/distutils/tests/test_install_scripts.py +++ b/Lib/distutils/tests/test_install_scripts.py @@ -7,7 +7,6 @@ from distutils.core import Distribution from distutils.tests import support -from test.support import run_unittest class InstallScriptsTestCase(support.TempdirManager, @@ -75,8 +74,5 @@ def write_script(name, text): self.assertIn(name, installed) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(InstallScriptsTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_log.py b/Lib/distutils/tests/test_log.py index ec2ae028de8777..0f3250313fdead 100644 --- a/Lib/distutils/tests/test_log.py +++ b/Lib/distutils/tests/test_log.py @@ -3,7 +3,7 @@ import io import sys import unittest -from test.support import swap_attr, run_unittest +from test.support import swap_attr from distutils import log @@ -39,8 +39,5 @@ def test_non_ascii(self): 'Fαtal\trrr' if errors == 'ignore' else 'Fαtal\t\\xc8rr\\u014dr') -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(TestLog) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_msvc9compiler.py b/Lib/distutils/tests/test_msvc9compiler.py index 6235405e31201b..60b99e25f8a02c 100644 --- a/Lib/distutils/tests/test_msvc9compiler.py +++ b/Lib/distutils/tests/test_msvc9compiler.py @@ -5,7 +5,6 @@ from distutils.errors import DistutilsPlatformError from distutils.tests import support -from test.support import run_unittest # A manifest with the only assembly reference being the msvcrt assembly, so # should have the assembly completely stripped. 
Note that although the @@ -177,8 +176,5 @@ def test_remove_entire_manifest(self): self.assertIsNone(got) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(msvc9compilerTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_msvccompiler.py b/Lib/distutils/tests/test_msvccompiler.py index dd67c3eb6d519d..7b3a6bbe7fda9d 100644 --- a/Lib/distutils/tests/test_msvccompiler.py +++ b/Lib/distutils/tests/test_msvccompiler.py @@ -5,7 +5,6 @@ from distutils.errors import DistutilsPlatformError from distutils.tests import support -from test.support import run_unittest SKIP_MESSAGE = (None if sys.platform == "win32" else @@ -74,8 +73,5 @@ def test_get_vc2015(self): else: raise unittest.SkipTest("VS 2015 is not installed") -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(msvccompilerTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_register.py b/Lib/distutils/tests/test_register.py index 7805c6d3c9f7b9..33f9443358ff27 100644 --- a/Lib/distutils/tests/test_register.py +++ b/Lib/distutils/tests/test_register.py @@ -5,7 +5,6 @@ import urllib import warnings -from test.support import run_unittest from test.support.warnings_helper import check_warnings from distutils.command import register as register_module @@ -317,8 +316,5 @@ def test_show_response(self): self.assertEqual(results[3], 75 * '-' + '\nxxx\n' + 75 * '-') -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(RegisterTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_sdist.py b/Lib/distutils/tests/test_sdist.py index 46b3a13e470c4e..58497fe0875672 100644 --- a/Lib/distutils/tests/test_sdist.py +++ b/Lib/distutils/tests/test_sdist.py @@ -6,7 +6,7 @@ import zipfile from os.path import join from textwrap import dedent -from test.support import captured_stdout, run_unittest +from test.support import captured_stdout from test.support.warnings_helper import check_warnings try: @@ -486,8 +486,5 @@ def test_make_distribution_owner_group(self): finally: archive.close() -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(SDistTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_spawn.py b/Lib/distutils/tests/test_spawn.py index a0a1145da5df8e..30bbe2a283ba20 100644 --- a/Lib/distutils/tests/test_spawn.py +++ b/Lib/distutils/tests/test_spawn.py @@ -3,7 +3,7 @@ import stat import sys import unittest.mock -from test.support import run_unittest, unix_shell, requires_subprocess +from test.support import unix_shell, requires_subprocess from test.support import os_helper from distutils.spawn import find_executable @@ -132,8 +132,5 @@ def test_spawn_missing_exe(self): self.assertIn("command 'does-not-exist' failed", str(ctx.exception)) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(SpawnTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_sysconfig.py b/Lib/distutils/tests/test_sysconfig.py index 6833d22af5fe2a..363834fe8b3a34 100644 --- a/Lib/distutils/tests/test_sysconfig.py +++ b/Lib/distutils/tests/test_sysconfig.py @@ -10,7 +10,7 @@ from distutils import sysconfig from distutils.ccompiler import get_default_compiler from distutils.tests import support -from test.support import run_unittest, swap_item, 
requires_subprocess, is_wasi +from test.support import swap_item, requires_subprocess, is_wasi from test.support.os_helper import TESTFN from test.support.warnings_helper import check_warnings @@ -254,11 +254,5 @@ def test_customize_compiler_before_get_config_vars(self): self.assertEqual(0, p.returncode, "Subprocess failed: " + outs) -def test_suite(): - suite = unittest.TestSuite() - suite.addTest(unittest.TestLoader().loadTestsFromTestCase(SysconfigTestCase)) - return suite - - if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_text_file.py b/Lib/distutils/tests/test_text_file.py index ebac3d52f9097c..fbf48515a1da70 100644 --- a/Lib/distutils/tests/test_text_file.py +++ b/Lib/distutils/tests/test_text_file.py @@ -3,7 +3,6 @@ import unittest from distutils.text_file import TextFile from distutils.tests import support -from test.support import run_unittest TEST_DATA = """# test file @@ -100,8 +99,5 @@ def test_input(count, description, file, expected_result): finally: in_file.close() -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(TextFileTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_unixccompiler.py b/Lib/distutils/tests/test_unixccompiler.py index a3484d4f94cd92..6a6592527c7659 100644 --- a/Lib/distutils/tests/test_unixccompiler.py +++ b/Lib/distutils/tests/test_unixccompiler.py @@ -1,7 +1,6 @@ """Tests for distutils.unixccompiler.""" import sys import unittest -from test.support import run_unittest from test.support.os_helper import EnvironmentVarGuard from distutils import sysconfig @@ -138,8 +137,5 @@ def gcv(v): self.assertEqual(self.cc.linker_so[0], 'my_ld') -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(UnixCCompilerTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_upload.py b/Lib/distutils/tests/test_upload.py index d6797414883060..9aae88bed582da 100644 --- a/Lib/distutils/tests/test_upload.py +++ b/Lib/distutils/tests/test_upload.py @@ -4,7 +4,6 @@ import unittest.mock as mock from urllib.error import HTTPError -from test.support import run_unittest from distutils.command import upload as upload_mod from distutils.command.upload import upload @@ -216,8 +215,5 @@ def test_wrong_exception_order(self): self.clear_logs() -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(uploadTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_util.py b/Lib/distutils/tests/test_util.py index f9c223f06ec68d..1b002cb35be1d0 100644 --- a/Lib/distutils/tests/test_util.py +++ b/Lib/distutils/tests/test_util.py @@ -3,7 +3,6 @@ import sys import unittest from copy import copy -from test.support import run_unittest from unittest import mock from distutils.errors import DistutilsPlatformError, DistutilsByteCompileError @@ -306,8 +305,5 @@ def test_grok_environment_error(self): self.assertEqual(msg, "error: Unable to find batch file") -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(UtilTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_version.py b/Lib/distutils/tests/test_version.py index 1563e0227b60dd..f102f2953ff50f 100644 --- a/Lib/distutils/tests/test_version.py +++ b/Lib/distutils/tests/test_version.py @@ -2,7 +2,6 @@ import unittest from distutils.version 
import LooseVersion from distutils.version import StrictVersion -from test.support import run_unittest class VersionTestCase(unittest.TestCase): @@ -80,8 +79,5 @@ def test_cmp(self): 'cmp(%s, %s) should be NotImplemented, got %s' % (v1, v2, res)) -def test_suite(): - return unittest.TestLoader().loadTestsFromTestCase(VersionTestCase) - if __name__ == "__main__": - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/distutils/tests/test_versionpredicate.py b/Lib/distutils/tests/test_versionpredicate.py index 28ae09dc2058da..b0e9ab2cf59b59 100644 --- a/Lib/distutils/tests/test_versionpredicate.py +++ b/Lib/distutils/tests/test_versionpredicate.py @@ -4,10 +4,10 @@ import distutils.versionpredicate import doctest -from test.support import run_unittest -def test_suite(): - return doctest.DocTestSuite(distutils.versionpredicate) +def load_tests(loader, tests, pattern): + tests.addTest(doctest.DocTestSuite(distutils.versionpredicate)) + return tests if __name__ == '__main__': - run_unittest(test_suite()) + unittest.main() diff --git a/Lib/test/test_distutils.py b/Lib/test/test_distutils.py index 28320fb5c0bfd1..12d1472b975c78 100644 --- a/Lib/test/test_distutils.py +++ b/Lib/test/test_distutils.py @@ -1,8 +1,6 @@ """Tests for distutils. -The tests for distutils are defined in the distutils.tests package; -the test_suite() function there returns a test suite that's ready to -be run. +The tests for distutils are defined in the distutils.tests package. """ import unittest @@ -12,13 +10,7 @@ with warnings_helper.check_warnings( ("The distutils package is deprecated", DeprecationWarning), quiet=True): - import distutils.tests - - -def load_tests(*_): - # used by unittest - return distutils.tests.test_suite() - + from distutils.tests import load_tests def tearDownModule(): support.reap_children() diff --git a/Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst b/Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst new file mode 100644 index 00000000000000..ed0d3947ad3494 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst @@ -0,0 +1 @@ +:mod:`distutils` tests can now be run via :mod:`unittest`. From fc9a5ef1a8894b2c61dbc762be6a8bf90906b621 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Wed, 25 Oct 2023 15:37:19 +0300 Subject: [PATCH 136/265] [3.11] [3.12] gh-111165: Move test running code from test.support to libregrtest (GH-111166) (GH-111316) (GH-111318) Remove no longer used functions run_unittest() and run_doctest() from the test.support module. 
(cherry picked from commit f6a45a03d0e0ef6b00c45a0de9a606b1d23cbd2f) (cherry picked from commit 5c4f9a1c7ed9fddd8c3ce4c1793af07f66f36a6b) --- Doc/library/test.rst | 35 ---- Lib/test/libregrtest/filter.py | 72 +++++++ Lib/test/libregrtest/findtests.py | 5 +- Lib/test/libregrtest/result.py | 26 ++- Lib/test/libregrtest/results.py | 3 +- Lib/test/libregrtest/setup.py | 5 +- Lib/test/libregrtest/single.py | 47 ++++- .../{support => libregrtest}/testresult.py | 0 Lib/test/support/__init__.py | 193 ------------------ Lib/test/test_regrtest.py | 118 ++++++++++- Lib/test/test_support.py | 115 ----------- ...-10-21-19-27-36.gh-issue-111165.FU6mUk.rst | 2 + 12 files changed, 266 insertions(+), 355 deletions(-) create mode 100644 Lib/test/libregrtest/filter.py rename Lib/test/{support => libregrtest}/testresult.py (100%) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst diff --git a/Doc/library/test.rst b/Doc/library/test.rst index aa67a9480ddfcb..a27e88fabe2939 100644 --- a/Doc/library/test.rst +++ b/Doc/library/test.rst @@ -499,34 +499,6 @@ The :mod:`test.support` module defines the following functions: Define match patterns on test filenames and test method names for filtering tests. -.. function:: run_unittest(*classes) - - Execute :class:`unittest.TestCase` subclasses passed to the function. The - function scans the classes for methods starting with the prefix ``test_`` - and executes the tests individually. - - It is also legal to pass strings as parameters; these should be keys in - ``sys.modules``. Each associated module will be scanned by - ``unittest.TestLoader.loadTestsFromModule()``. This is usually seen in the - following :func:`test_main` function:: - - def test_main(): - support.run_unittest(__name__) - - This will run all tests defined in the named module. - - -.. function:: run_doctest(module, verbosity=None, optionflags=0) - - Run :func:`doctest.testmod` on the given *module*. Return - ``(failure_count, test_count)``. - - If *verbosity* is ``None``, :func:`doctest.testmod` is run with verbosity - set to :data:`verbose`. Otherwise, it is run with verbosity set to - ``None``. *optionflags* is passed as ``optionflags`` to - :func:`doctest.testmod`. - - .. function:: setswitchinterval(interval) Set the :func:`sys.setswitchinterval` to the given *interval*. Defines @@ -1052,13 +1024,6 @@ The :mod:`test.support` module defines the following classes: Try to match a single stored value (*dv*) with a supplied value (*v*). -.. class:: BasicTestRunner() - - .. method:: run(test) - - Run *test* and return the result. - - :mod:`test.support.socket_helper` --- Utilities for socket tests ================================================================ diff --git a/Lib/test/libregrtest/filter.py b/Lib/test/libregrtest/filter.py new file mode 100644 index 00000000000000..817624d79e9263 --- /dev/null +++ b/Lib/test/libregrtest/filter.py @@ -0,0 +1,72 @@ +import itertools +import operator +import re + + +# By default, don't filter tests +_test_matchers = () +_test_patterns = () + + +def match_test(test): + # Function used by support.run_unittest() and regrtest --list-cases + result = False + for matcher, result in reversed(_test_matchers): + if matcher(test.id()): + return result + return not result + + +def _is_full_match_test(pattern): + # If a pattern contains at least one dot, it's considered + # as a full test identifier. + # Example: 'test.test_os.FileTests.test_access'. 
+ # + # ignore patterns which contain fnmatch patterns: '*', '?', '[...]' + # or '[!...]'. For example, ignore 'test_access*'. + return ('.' in pattern) and (not re.search(r'[?*\[\]]', pattern)) + + +def set_match_tests(patterns): + global _test_matchers, _test_patterns + + if not patterns: + _test_matchers = () + _test_patterns = () + else: + itemgetter = operator.itemgetter + patterns = tuple(patterns) + if patterns != _test_patterns: + _test_matchers = [ + (_compile_match_function(map(itemgetter(0), it)), result) + for result, it in itertools.groupby(patterns, itemgetter(1)) + ] + _test_patterns = patterns + + +def _compile_match_function(patterns): + patterns = list(patterns) + + if all(map(_is_full_match_test, patterns)): + # Simple case: all patterns are full test identifier. + # The test.bisect_cmd utility only uses such full test identifiers. + return set(patterns).__contains__ + else: + import fnmatch + regex = '|'.join(map(fnmatch.translate, patterns)) + # The search *is* case sensitive on purpose: + # don't use flags=re.IGNORECASE + regex_match = re.compile(regex).match + + def match_test_regex(test_id, regex_match=regex_match): + if regex_match(test_id): + # The regex matches the whole identifier, for example + # 'test.test_os.FileTests.test_access'. + return True + else: + # Try to match parts of the test identifier. + # For example, split 'test.test_os.FileTests.test_access' + # into: 'test', 'test_os', 'FileTests' and 'test_access'. + return any(map(regex_match, test_id.split("."))) + + return match_test_regex diff --git a/Lib/test/libregrtest/findtests.py b/Lib/test/libregrtest/findtests.py index dd39fe1ce20a77..a4235b52eca0af 100644 --- a/Lib/test/libregrtest/findtests.py +++ b/Lib/test/libregrtest/findtests.py @@ -4,6 +4,7 @@ from test import support +from .filter import match_test, set_match_tests from .utils import ( StrPath, TestName, TestTuple, TestList, TestFilter, abs_module_name, count, printlist) @@ -78,14 +79,14 @@ def _list_cases(suite): if isinstance(test, unittest.TestSuite): _list_cases(test) elif isinstance(test, unittest.TestCase): - if support.match_test(test): + if match_test(test): print(test.id()) def list_cases(tests: TestTuple, *, match_tests: TestFilter | None = None, test_dir: StrPath | None = None): support.verbose = False - support.set_match_tests(match_tests) + set_match_tests(match_tests) skipped = [] for test_name in tests: diff --git a/Lib/test/libregrtest/result.py b/Lib/test/libregrtest/result.py index d6b0d5ad383a5b..8bfd3665ac93d5 100644 --- a/Lib/test/libregrtest/result.py +++ b/Lib/test/libregrtest/result.py @@ -2,13 +2,35 @@ import json from typing import Any -from test.support import TestStats - from .utils import ( StrJSON, TestName, FilterTuple, format_duration, normalize_test_name, print_warning) +@dataclasses.dataclass(slots=True) +class TestStats: + tests_run: int = 0 + failures: int = 0 + skipped: int = 0 + + @staticmethod + def from_unittest(result): + return TestStats(result.testsRun, + len(result.failures), + len(result.skipped)) + + @staticmethod + def from_doctest(results): + return TestStats(results.attempted, + results.failed, + results.skipped) + + def accumulate(self, stats): + self.tests_run += stats.tests_run + self.failures += stats.failures + self.skipped += stats.skipped + + # Avoid enum.Enum to reduce the number of imports when tests are run class State: PASSED = "PASSED" diff --git a/Lib/test/libregrtest/results.py b/Lib/test/libregrtest/results.py index 3708078ff0bf3a..1feb43f8c074db 100644 --- 
a/Lib/test/libregrtest/results.py +++ b/Lib/test/libregrtest/results.py @@ -1,8 +1,7 @@ import sys -from test.support import TestStats from .runtests import RunTests -from .result import State, TestResult +from .result import State, TestResult, TestStats from .utils import ( StrPath, TestName, TestTuple, TestList, FilterDict, printlist, count, format_duration) diff --git a/Lib/test/libregrtest/setup.py b/Lib/test/libregrtest/setup.py index 6a96b051394d20..97edba9f87d7f9 100644 --- a/Lib/test/libregrtest/setup.py +++ b/Lib/test/libregrtest/setup.py @@ -8,6 +8,7 @@ from test import support from test.support.os_helper import TESTFN_UNDECODABLE, FS_NONASCII +from .filter import set_match_tests from .runtests import RunTests from .utils import ( setup_unraisable_hook, setup_threading_excepthook, fix_umask, @@ -92,11 +93,11 @@ def setup_tests(runtests: RunTests): support.PGO = runtests.pgo support.PGO_EXTENDED = runtests.pgo_extended - support.set_match_tests(runtests.match_tests) + set_match_tests(runtests.match_tests) if runtests.use_junit: support.junit_xml_list = [] - from test.support.testresult import RegressionTestResult + from .testresult import RegressionTestResult RegressionTestResult.USE_XML = True else: support.junit_xml_list = None diff --git a/Lib/test/libregrtest/single.py b/Lib/test/libregrtest/single.py index 0304f858edf42c..b4ae29905721dc 100644 --- a/Lib/test/libregrtest/single.py +++ b/Lib/test/libregrtest/single.py @@ -9,13 +9,14 @@ import unittest from test import support -from test.support import TestStats from test.support import threading_helper -from .result import State, TestResult +from .filter import match_test +from .result import State, TestResult, TestStats from .runtests import RunTests from .save_env import saved_test_environment from .setup import setup_tests +from .testresult import get_test_runner from .utils import ( TestName, clear_caches, remove_testfn, abs_module_name, print_warning) @@ -33,7 +34,47 @@ def run_unittest(test_mod): print(error, file=sys.stderr) if loader.errors: raise Exception("errors while loading tests") - return support.run_unittest(tests) + _filter_suite(tests, match_test) + return _run_suite(tests) + +def _filter_suite(suite, pred): + """Recursively filter test cases in a suite based on a predicate.""" + newtests = [] + for test in suite._tests: + if isinstance(test, unittest.TestSuite): + _filter_suite(test, pred) + newtests.append(test) + else: + if pred(test): + newtests.append(test) + suite._tests = newtests + +def _run_suite(suite): + """Run tests from a unittest.TestSuite-derived class.""" + runner = get_test_runner(sys.stdout, + verbosity=support.verbose, + capture_output=(support.junit_xml_list is not None)) + + result = runner.run(suite) + + if support.junit_xml_list is not None: + support.junit_xml_list.append(result.get_xml_element()) + + if not result.testsRun and not result.skipped and not result.errors: + raise support.TestDidNotRun + if not result.wasSuccessful(): + stats = TestStats.from_unittest(result) + if len(result.errors) == 1 and not result.failures: + err = result.errors[0][1] + elif len(result.failures) == 1 and not result.errors: + err = result.failures[0][1] + else: + err = "multiple errors occurred" + if not verbose: err += "; run in verbose mode for details" + errors = [(str(tc), exc_str) for tc, exc_str in result.errors] + failures = [(str(tc), exc_str) for tc, exc_str in result.failures] + raise support.TestFailedWithDetails(err, errors, failures, stats=stats) + return result def regrtest_runner(result: 
TestResult, test_func, runtests: RunTests) -> None: diff --git a/Lib/test/support/testresult.py b/Lib/test/libregrtest/testresult.py similarity index 100% rename from Lib/test/support/testresult.py rename to Lib/test/libregrtest/testresult.py diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 2d53f635cef7d3..7ebc42e5bdd01c 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -6,9 +6,7 @@ import contextlib import dataclasses import functools -import itertools import getpass -import operator import os import re import stat @@ -19,8 +17,6 @@ import unittest import warnings -from .testresult import get_test_runner - try: from _testcapi import unicode_legacy_string @@ -39,7 +35,6 @@ "is_resource_enabled", "requires", "requires_freebsd_version", "requires_linux_version", "requires_mac_ver", "check_syntax_error", - "BasicTestRunner", "run_unittest", "run_doctest", "requires_gzip", "requires_bz2", "requires_lzma", "bigmemtest", "bigaddrspacetest", "cpython_only", "get_attribute", "requires_IEEE_754", "requires_zlib", @@ -1026,12 +1021,6 @@ def wrapper(self): #======================================================================= # unittest integration. -class BasicTestRunner: - def run(self, test): - result = unittest.TestResult() - test(result) - return result - def _id(obj): return obj @@ -1110,156 +1099,6 @@ def refcount_test(test): return no_tracing(cpython_only(test)) -def _filter_suite(suite, pred): - """Recursively filter test cases in a suite based on a predicate.""" - newtests = [] - for test in suite._tests: - if isinstance(test, unittest.TestSuite): - _filter_suite(test, pred) - newtests.append(test) - else: - if pred(test): - newtests.append(test) - suite._tests = newtests - -@dataclasses.dataclass(slots=True) -class TestStats: - tests_run: int = 0 - failures: int = 0 - skipped: int = 0 - - @staticmethod - def from_unittest(result): - return TestStats(result.testsRun, - len(result.failures), - len(result.skipped)) - - @staticmethod - def from_doctest(results): - return TestStats(results.attempted, - results.failed) - - def accumulate(self, stats): - self.tests_run += stats.tests_run - self.failures += stats.failures - self.skipped += stats.skipped - - -def _run_suite(suite): - """Run tests from a unittest.TestSuite-derived class.""" - runner = get_test_runner(sys.stdout, - verbosity=verbose, - capture_output=(junit_xml_list is not None)) - - result = runner.run(suite) - - if junit_xml_list is not None: - junit_xml_list.append(result.get_xml_element()) - - if not result.testsRun and not result.skipped and not result.errors: - raise TestDidNotRun - if not result.wasSuccessful(): - stats = TestStats.from_unittest(result) - if len(result.errors) == 1 and not result.failures: - err = result.errors[0][1] - elif len(result.failures) == 1 and not result.errors: - err = result.failures[0][1] - else: - err = "multiple errors occurred" - if not verbose: err += "; run in verbose mode for details" - errors = [(str(tc), exc_str) for tc, exc_str in result.errors] - failures = [(str(tc), exc_str) for tc, exc_str in result.failures] - raise TestFailedWithDetails(err, errors, failures, stats=stats) - return result - - -# By default, don't filter tests -_test_matchers = () -_test_patterns = () - - -def match_test(test): - # Function used by support.run_unittest() and regrtest --list-cases - result = False - for matcher, result in reversed(_test_matchers): - if matcher(test.id()): - return result - return not result - - -def 
_is_full_match_test(pattern): - # If a pattern contains at least one dot, it's considered - # as a full test identifier. - # Example: 'test.test_os.FileTests.test_access'. - # - # ignore patterns which contain fnmatch patterns: '*', '?', '[...]' - # or '[!...]'. For example, ignore 'test_access*'. - return ('.' in pattern) and (not re.search(r'[?*\[\]]', pattern)) - - -def set_match_tests(patterns): - global _test_matchers, _test_patterns - - if not patterns: - _test_matchers = () - _test_patterns = () - else: - itemgetter = operator.itemgetter - patterns = tuple(patterns) - if patterns != _test_patterns: - _test_matchers = [ - (_compile_match_function(map(itemgetter(0), it)), result) - for result, it in itertools.groupby(patterns, itemgetter(1)) - ] - _test_patterns = patterns - - -def _compile_match_function(patterns): - patterns = list(patterns) - - if all(map(_is_full_match_test, patterns)): - # Simple case: all patterns are full test identifier. - # The test.bisect_cmd utility only uses such full test identifiers. - return set(patterns).__contains__ - else: - import fnmatch - regex = '|'.join(map(fnmatch.translate, patterns)) - # The search *is* case sensitive on purpose: - # don't use flags=re.IGNORECASE - regex_match = re.compile(regex).match - - def match_test_regex(test_id, regex_match=regex_match): - if regex_match(test_id): - # The regex matches the whole identifier, for example - # 'test.test_os.FileTests.test_access'. - return True - else: - # Try to match parts of the test identifier. - # For example, split 'test.test_os.FileTests.test_access' - # into: 'test', 'test_os', 'FileTests' and 'test_access'. - return any(map(regex_match, test_id.split("."))) - - return match_test_regex - - -def run_unittest(*classes): - """Run tests from unittest.TestCase-derived classes.""" - valid_types = (unittest.TestSuite, unittest.TestCase) - loader = unittest.TestLoader() - suite = unittest.TestSuite() - for cls in classes: - if isinstance(cls, str): - if cls in sys.modules: - suite.addTest(loader.loadTestsFromModule(sys.modules[cls])) - else: - raise ValueError("str arguments must be keys in sys.modules") - elif isinstance(cls, valid_types): - suite.addTest(cls) - else: - suite.addTest(loader.loadTestsFromTestCase(cls)) - _filter_suite(suite, match_test) - return _run_suite(suite) - #======================================================================= # Check for the presence of docstrings. @@ -1280,38 +1119,6 @@ def _check_docstrings(): "test requires docstrings") -#======================================================================= -# doctest driver. - -def run_doctest(module, verbosity=None, optionflags=0): - """Run doctest on the given module. Return (#failures, #tests). - - If optional argument verbosity is not specified (or is None), pass - support's belief about verbosity on to doctest. Else doctest's - usual behavior is used (it searches sys.argv for -v). - """ - - import doctest - - if verbosity is None: - verbosity = verbose - else: - verbosity = None - - results = doctest.testmod(module, - verbose=verbosity, - optionflags=optionflags) - if results.failed: - stats = TestStats.from_doctest(results) - raise TestFailed(f"{results.failed} of {results.attempted} " - f"doctests failed", - stats=stats) - if verbose: - print('doctest (%s) ... %d tests with zero failures' % - (module.__name__, results.attempted)) - return results - - #======================================================================= # Support for saving and restoring the imported modules. 
diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 0c12b0efff88e5..aa3f65487a288e 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -22,11 +22,13 @@ import textwrap import unittest from test import support -from test.support import os_helper, TestStats +from test.support import os_helper from test.libregrtest import cmdline from test.libregrtest import main from test.libregrtest import setup from test.libregrtest import utils +from test.libregrtest.filter import set_match_tests, match_test +from test.libregrtest.result import TestStats from test.libregrtest.utils import normalize_test_name if not support.has_subprocess_support: @@ -2182,6 +2184,120 @@ def test_format_resources(self): format_resources((*ALL_RESOURCES, "tzdata")), 'resources: all,tzdata') + def test_match_test(self): + class Test: + def __init__(self, test_id): + self.test_id = test_id + + def id(self): + return self.test_id + + test_access = Test('test.test_os.FileTests.test_access') + test_chdir = Test('test.test_os.Win32ErrorTests.test_chdir') + test_copy = Test('test.test_shutil.TestCopy.test_copy') + + # Test acceptance + with support.swap_attr(support, '_test_matchers', ()): + # match all + set_match_tests([]) + self.assertTrue(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + + # match all using None + set_match_tests(None) + self.assertTrue(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + + # match the full test identifier + set_match_tests([(test_access.id(), True)]) + self.assertTrue(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + + # match the module name + set_match_tests([('test_os', True)]) + self.assertTrue(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + self.assertFalse(match_test(test_copy)) + + # Test '*' pattern + set_match_tests([('test_*', True)]) + self.assertTrue(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + + # Test case sensitivity + set_match_tests([('filetests', True)]) + self.assertFalse(match_test(test_access)) + set_match_tests([('FileTests', True)]) + self.assertTrue(match_test(test_access)) + + # Test pattern containing '.' 
and a '*' metacharacter + set_match_tests([('*test_os.*.test_*', True)]) + self.assertTrue(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + self.assertFalse(match_test(test_copy)) + + # Multiple patterns + set_match_tests([(test_access.id(), True), (test_chdir.id(), True)]) + self.assertTrue(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + self.assertFalse(match_test(test_copy)) + + set_match_tests([('test_access', True), ('DONTMATCH', True)]) + self.assertTrue(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + + # Test rejection + with support.swap_attr(support, '_test_matchers', ()): + # match the full test identifier + set_match_tests([(test_access.id(), False)]) + self.assertFalse(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + + # match the module name + set_match_tests([('test_os', False)]) + self.assertFalse(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + self.assertTrue(match_test(test_copy)) + + # Test '*' pattern + set_match_tests([('test_*', False)]) + self.assertFalse(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + + # Test case sensitivity + set_match_tests([('filetests', False)]) + self.assertTrue(match_test(test_access)) + set_match_tests([('FileTests', False)]) + self.assertFalse(match_test(test_access)) + + # Test pattern containing '.' and a '*' metacharacter + set_match_tests([('*test_os.*.test_*', False)]) + self.assertFalse(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + self.assertTrue(match_test(test_copy)) + + # Multiple patterns + set_match_tests([(test_access.id(), False), (test_chdir.id(), False)]) + self.assertFalse(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + self.assertTrue(match_test(test_copy)) + + set_match_tests([('test_access', False), ('DONTMATCH', False)]) + self.assertFalse(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + + # Test mixed filters + with support.swap_attr(support, '_test_matchers', ()): + set_match_tests([('*test_os', False), ('test_access', True)]) + self.assertTrue(match_test(test_access)) + self.assertFalse(match_test(test_chdir)) + self.assertTrue(match_test(test_copy)) + + set_match_tests([('*test_os', True), ('test_access', False)]) + self.assertFalse(match_test(test_access)) + self.assertTrue(match_test(test_chdir)) + self.assertFalse(match_test(test_copy)) + if __name__ == '__main__': unittest.main() diff --git a/Lib/test/test_support.py b/Lib/test/test_support.py index 3f9159642ba178..147ae577175129 100644 --- a/Lib/test/test_support.py +++ b/Lib/test/test_support.py @@ -550,120 +550,6 @@ def test_optim_args_from_interpreter_flags(self): with self.subTest(opts=opts): self.check_options(opts, 'optim_args_from_interpreter_flags') - def test_match_test(self): - class Test: - def __init__(self, test_id): - self.test_id = test_id - - def id(self): - return self.test_id - - test_access = Test('test.test_os.FileTests.test_access') - test_chdir = Test('test.test_os.Win32ErrorTests.test_chdir') - test_copy = Test('test.test_shutil.TestCopy.test_copy') - - # Test acceptance - with support.swap_attr(support, '_test_matchers', ()): - # match all - support.set_match_tests([]) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - - # match all using None - support.set_match_tests(None) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - - # match the full 
test identifier - support.set_match_tests([(test_access.id(), True)]) - self.assertTrue(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - - # match the module name - support.set_match_tests([('test_os', True)]) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - self.assertFalse(support.match_test(test_copy)) - - # Test '*' pattern - support.set_match_tests([('test_*', True)]) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - - # Test case sensitivity - support.set_match_tests([('filetests', True)]) - self.assertFalse(support.match_test(test_access)) - support.set_match_tests([('FileTests', True)]) - self.assertTrue(support.match_test(test_access)) - - # Test pattern containing '.' and a '*' metacharacter - support.set_match_tests([('*test_os.*.test_*', True)]) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - self.assertFalse(support.match_test(test_copy)) - - # Multiple patterns - support.set_match_tests([(test_access.id(), True), (test_chdir.id(), True)]) - self.assertTrue(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - self.assertFalse(support.match_test(test_copy)) - - support.set_match_tests([('test_access', True), ('DONTMATCH', True)]) - self.assertTrue(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - - # Test rejection - with support.swap_attr(support, '_test_matchers', ()): - # match the full test identifier - support.set_match_tests([(test_access.id(), False)]) - self.assertFalse(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - - # match the module name - support.set_match_tests([('test_os', False)]) - self.assertFalse(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - self.assertTrue(support.match_test(test_copy)) - - # Test '*' pattern - support.set_match_tests([('test_*', False)]) - self.assertFalse(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - - # Test case sensitivity - support.set_match_tests([('filetests', False)]) - self.assertTrue(support.match_test(test_access)) - support.set_match_tests([('FileTests', False)]) - self.assertFalse(support.match_test(test_access)) - - # Test pattern containing '.' 
and a '*' metacharacter - support.set_match_tests([('*test_os.*.test_*', False)]) - self.assertFalse(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - self.assertTrue(support.match_test(test_copy)) - - # Multiple patterns - support.set_match_tests([(test_access.id(), False), (test_chdir.id(), False)]) - self.assertFalse(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - self.assertTrue(support.match_test(test_copy)) - - support.set_match_tests([('test_access', False), ('DONTMATCH', False)]) - self.assertFalse(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - - # Test mixed filters - with support.swap_attr(support, '_test_matchers', ()): - support.set_match_tests([('*test_os', False), ('test_access', True)]) - self.assertTrue(support.match_test(test_access)) - self.assertFalse(support.match_test(test_chdir)) - self.assertTrue(support.match_test(test_copy)) - - support.set_match_tests([('*test_os', True), ('test_access', False)]) - self.assertFalse(support.match_test(test_access)) - self.assertTrue(support.match_test(test_chdir)) - self.assertFalse(support.match_test(test_copy)) - @unittest.skipIf(support.is_emscripten, "Unstable in Emscripten") @unittest.skipIf(support.is_wasi, "Unavailable on WASI") def test_fd_count(self): @@ -875,7 +761,6 @@ def test_copy_python_src_ignore(self): # precisionbigmemtest # bigaddrspacetest # requires_resource - # run_doctest # threading_cleanup # reap_threads # can_symlink diff --git a/Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst b/Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst new file mode 100644 index 00000000000000..753a3c00029ceb --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst @@ -0,0 +1,2 @@ +Remove no longer used functions ``run_unittest()`` and ``run_doctest()`` and +class ``BasicTestRunner`` from the :mod:`test.support` module. From 07664c9ddb5b5ee3371912fdc47da24bbbc7a6b3 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 25 Oct 2023 16:07:44 +0200 Subject: [PATCH 137/265] [3.11] gh-108590: Improve sqlite3 docs on encoding issues and how to handle those (GH-108699) (#111325) Add a guide for how to handle non-UTF-8 text encodings. Link to that guide from the 'text_factory' docs. (cherry picked from commit 1262e41842957c3b402fc0cf0a6eb2ea230c828f) Co-authored-by: Erlend E. Aasland Co-authored-by: Alex Waygood Co-authored-by: C.A.M. Gerlach Co-authored-by: Corvin Co-authored-by: Ezio Melotti Co-authored-by: Serhiy Storchaka --- Doc/library/sqlite3.rst | 83 +++++++++++++++++++++++++---------------- 1 file changed, 50 insertions(+), 33 deletions(-) diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index ef92bad596ef15..3e9b7a270ad5ef 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -1029,6 +1029,10 @@ Connection objects f.write('%s\n' % line) con.close() + .. seealso:: + + :ref:`sqlite3-howto-encoding` + .. method:: backup(target, *, pages=-1, progress=None, name="main", sleep=0.250) @@ -1095,6 +1099,10 @@ Connection objects .. versionadded:: 3.7 + .. seealso:: + + :ref:`sqlite3-howto-encoding` + .. method:: getlimit(category, /) Get a connection runtime limit. @@ -1253,39 +1261,8 @@ Connection objects and returns a text representation of it. The callable is invoked for SQLite values with the ``TEXT`` data type. By default, this attribute is set to :class:`str`. 
- If you want to return ``bytes`` instead, set *text_factory* to ``bytes``. - Example: - - .. testcode:: - - con = sqlite3.connect(":memory:") - cur = con.cursor() - - AUSTRIA = "Österreich" - - # by default, rows are returned as str - cur.execute("SELECT ?", (AUSTRIA,)) - row = cur.fetchone() - assert row[0] == AUSTRIA - - # but we can make sqlite3 always return bytestrings ... - con.text_factory = bytes - cur.execute("SELECT ?", (AUSTRIA,)) - row = cur.fetchone() - assert type(row[0]) is bytes - # the bytestrings will be encoded in UTF-8, unless you stored garbage in the - # database ... - assert row[0] == AUSTRIA.encode("utf-8") - - # we can also implement a custom text_factory ... - # here we implement one that appends "foo" to all strings - con.text_factory = lambda x: x.decode("utf-8") + "foo" - cur.execute("SELECT ?", ("bar",)) - row = cur.fetchone() - assert row[0] == "barfoo" - - con.close() + See :ref:`sqlite3-howto-encoding` for more details. .. attribute:: total_changes @@ -1423,7 +1400,6 @@ Cursor objects COMMIT; """) - .. method:: fetchone() If :attr:`~Cursor.row_factory` is ``None``, @@ -2369,6 +2345,47 @@ With some adjustments, the above recipe can be adapted to use a instead of a :class:`~collections.namedtuple`. +.. _sqlite3-howto-encoding: + +How to handle non-UTF-8 text encodings +^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ + +By default, :mod:`!sqlite3` uses :class:`str` to adapt SQLite values +with the ``TEXT`` data type. +This works well for UTF-8 encoded text, but it might fail for other encodings +and invalid UTF-8. +You can use a custom :attr:`~Connection.text_factory` to handle such cases. + +Because of SQLite's `flexible typing`_, it is not uncommon to encounter table +columns with the ``TEXT`` data type containing non-UTF-8 encodings, +or even arbitrary data. +To demonstrate, let's assume we have a database with ISO-8859-2 (Latin-2) +encoded text, for example a table of Czech-English dictionary entries. +Assuming we now have a :class:`Connection` instance :py:data:`!con` +connected to this database, +we can decode the Latin-2 encoded text using this :attr:`~Connection.text_factory`: + +.. testcode:: + + con.text_factory = lambda data: str(data, encoding="latin2") + +For invalid UTF-8 or arbitrary data in stored in ``TEXT`` table columns, +you can use the following technique, borrowed from the :ref:`unicode-howto`: + +.. testcode:: + + con.text_factory = lambda data: str(data, errors="surrogateescape") + +.. note:: + + The :mod:`!sqlite3` module API does not support strings + containing surrogates. + +.. seealso:: + + :ref:`unicode-howto` + + .. _sqlite3-explanation: Explanation From 6fea61a9e02260648fbec204e9caac6d5176cc7b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 25 Oct 2023 17:15:50 +0200 Subject: [PATCH 138/265] [3.11] gh-111165: Add missed "support." 
prefix for "verbose" (GH-111327) (GH-111329) (cherry picked from commit a4981921aa2c00f3883ef593fde1dbc034e3c304) Co-authored-by: Serhiy Storchaka --- Lib/test/libregrtest/single.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Lib/test/libregrtest/single.py b/Lib/test/libregrtest/single.py index b4ae29905721dc..ad75ef54a8c3f8 100644 --- a/Lib/test/libregrtest/single.py +++ b/Lib/test/libregrtest/single.py @@ -70,7 +70,7 @@ def _run_suite(suite): err = result.failures[0][1] else: err = "multiple errors occurred" - if not verbose: err += "; run in verbose mode for details" + if not support.verbose: err += "; run in verbose mode for details" errors = [(str(tc), exc_str) for tc, exc_str in result.errors] failures = [(str(tc), exc_str) for tc, exc_str in result.failures] raise support.TestFailedWithDetails(err, errors, failures, stats=stats) From 12c7e5071cb6013df69586ccc26835742c4d86b9 Mon Sep 17 00:00:00 2001 From: Savannah Ostrowski Date: Wed, 25 Oct 2023 13:47:16 -0700 Subject: [PATCH 139/265] [3.11] GH-94438: Restore ability to jump over None tests (GH-111338) (cherry picked from commit 6640f1d) --- Lib/test/test_sys_settrace.py | 34 +++++++++++++++++++ Misc/ACKS | 1 + ...3-10-23-22-11-09.gh-issue-94438.y2pITu.rst | 1 + Objects/frameobject.c | 12 +++++-- 4 files changed, 46 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst diff --git a/Lib/test/test_sys_settrace.py b/Lib/test/test_sys_settrace.py index 3540192b5bb5bb..e97e122534a2f3 100644 --- a/Lib/test/test_sys_settrace.py +++ b/Lib/test/test_sys_settrace.py @@ -1952,6 +1952,40 @@ def test_jump_simple_backwards(output): output.append(1) output.append(2) + @jump_test(1, 4, [5]) + def test_jump_is_none_forwards(output): + x = None + if x is None: + output.append(3) + else: + output.append(5) + + @jump_test(6, 5, [3, 5, 6]) + def test_jump_is_none_backwards(output): + x = None + if x is None: + output.append(3) + else: + output.append(5) + output.append(6) + + @jump_test(1, 4, [5]) + def test_jump_is_not_none_forwards(output): + x = None + if x is not None: + output.append(3) + else: + output.append(5) + + @jump_test(6, 5, [5, 5, 6]) + def test_jump_is_not_none_backwards(output): + x = None + if x is not None: + output.append(3) + else: + output.append(5) + output.append(6) + @jump_test(3, 5, [2, 5]) def test_jump_out_of_block_forwards(output): for i in 1, 2: diff --git a/Misc/ACKS b/Misc/ACKS index 6eff2cc1f19e28..7efa01db59e7ea 100644 --- a/Misc/ACKS +++ b/Misc/ACKS @@ -1324,6 +1324,7 @@ Michele Orrù Tomáš Orsava Oleg Oshmyan Denis Osipov +Savannah Ostrowski Denis S. Otkidach Peter Otten Michael Otteneder diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst new file mode 100644 index 00000000000000..b6e147a48a8cd8 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst @@ -0,0 +1 @@ +Fix a regression that prevented jumping across ``is None`` and ``is not None`` when debugging. Patch by Savannah Ostrowski. 
diff --git a/Objects/frameobject.c b/Objects/frameobject.c index 425749d14cf014..c95e8711880517 100644 --- a/Objects/frameobject.c +++ b/Objects/frameobject.c @@ -315,19 +315,27 @@ mark_stacks(PyCodeObject *code_obj, int len) case POP_JUMP_BACKWARD_IF_FALSE: case POP_JUMP_FORWARD_IF_TRUE: case POP_JUMP_BACKWARD_IF_TRUE: + case POP_JUMP_FORWARD_IF_NONE: + case POP_JUMP_BACKWARD_IF_NONE: + case POP_JUMP_FORWARD_IF_NOT_NONE: + case POP_JUMP_BACKWARD_IF_NOT_NONE: { int64_t target_stack; int j = get_arg(code, i); if (opcode == POP_JUMP_FORWARD_IF_FALSE || opcode == POP_JUMP_FORWARD_IF_TRUE || opcode == JUMP_IF_FALSE_OR_POP || - opcode == JUMP_IF_TRUE_OR_POP) + opcode == JUMP_IF_TRUE_OR_POP || + opcode == POP_JUMP_FORWARD_IF_NONE || + opcode == POP_JUMP_FORWARD_IF_NOT_NONE) { j += i + 1; } else { assert(opcode == POP_JUMP_BACKWARD_IF_FALSE || - opcode == POP_JUMP_BACKWARD_IF_TRUE); + opcode == POP_JUMP_BACKWARD_IF_TRUE || + opcode == POP_JUMP_BACKWARD_IF_NONE || + opcode == POP_JUMP_BACKWARD_IF_NOT_NONE); j = i + 1 - j; } assert(j < len); From 762aba72eb35a5696048ed89df35385c8d784c52 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 26 Oct 2023 15:12:27 +0200 Subject: [PATCH 140/265] [3.11] gh-111348: Fix direct invocation of `test_doctest`; remove `test_doctest.test_coverage` (GH-111349) (#111360) gh-111348: Fix direct invocation of `test_doctest`; remove `test_doctest.test_coverage` (GH-111349) (cherry picked from commit 31c05b72c15885ad5ff298de39456d8baed28448) Co-authored-by: Nikita Sobolev Co-authored-by: Sergey B Kirpichev --- Lib/test/test_doctest.py | 16 +--------------- 1 file changed, 1 insertion(+), 15 deletions(-) diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index fbeed96e5bee36..98bda46d312909 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -3318,19 +3318,5 @@ def load_tests(loader, tests, pattern): return tests -def test_coverage(coverdir): - trace = import_helper.import_module('trace') - tracer = trace.Trace(ignoredirs=[sys.base_prefix, sys.base_exec_prefix,], - trace=0, count=1) - tracer.run('test_main()') - r = tracer.results() - print('Writing coverage results...') - r.write_results(show_missing=True, summary=True, - coverdir=coverdir) - - if __name__ == '__main__': - if '-c' in sys.argv: - test_coverage('/tmp/doctest.cover') - else: - unittest.main() + unittest.main(module='test.test_doctest') From 22cde39fbf65df8790d1a93f199affa8a0361c4b Mon Sep 17 00:00:00 2001 From: Pablo Galindo Salgado Date: Fri, 27 Oct 2023 09:46:20 +0900 Subject: [PATCH 141/265] [3.11] bpo-43950: handle wide unicode characters in tracebacks (GH-28150) (#111373) --- Lib/test/test_traceback.py | 57 ++++++++++++++++++- Lib/traceback.py | 53 +++++++++++++---- ...3-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst | 3 + Parser/pegen.c | 55 ++++++++++++++++++ Parser/pegen.h | 1 + Python/traceback.c | 35 +++++++++++- 6 files changed, 189 insertions(+), 15 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst diff --git a/Lib/test/test_traceback.py b/Lib/test/test_traceback.py index ccc59870c2a952..2881a3489084a0 100644 --- a/Lib/test/test_traceback.py +++ b/Lib/test/test_traceback.py @@ -893,7 +893,62 @@ def f(): f" callable()", f" File \"{__file__}\", line {f.__code__.co_firstlineno + 4}, in f", f" print(1, www(", - f" ^^^^", + f" ^^^^^^^", + ] + self.assertEqual(actual, expected) + + def test_byte_offset_with_wide_characters_term_highlight(self): + 
def f(): + 说明说明 = 1 + şçöğıĤellö = 0 # not wide but still non-ascii + return 说明说明 / şçöğıĤellö + + actual = self.get_exception(f) + expected = [ + f"Traceback (most recent call last):", + f" File \"{__file__}\", line {self.callable_line}, in get_exception", + f" callable()", + f" File \"{__file__}\", line {f.__code__.co_firstlineno + 3}, in f", + f" return 说明说明 / şçöğıĤellö", + f" ~~~~~~~~~^~~~~~~~~~~~", + ] + self.assertEqual(actual, expected) + + def test_byte_offset_with_emojis_term_highlight(self): + def f(): + return "✨🐍" + func_说明说明("📗🚛", + "📗🚛") + "🐍" + + actual = self.get_exception(f) + expected = [ + f"Traceback (most recent call last):", + f" File \"{__file__}\", line {self.callable_line}, in get_exception", + f" callable()", + f" File \"{__file__}\", line {f.__code__.co_firstlineno + 1}, in f", + f' return "✨🐍" + func_说明说明("📗🚛",', + f" ^^^^^^^^^^^^^", + ] + self.assertEqual(actual, expected) + + def test_byte_offset_wide_chars_subscript(self): + def f(): + my_dct = { + "✨🚛✨": { + "说明": { + "🐍🐍🐍": None + } + } + } + return my_dct["✨🚛✨"]["说明"]["🐍"]["说明"]["🐍🐍"] + + actual = self.get_exception(f) + expected = [ + f"Traceback (most recent call last):", + f" File \"{__file__}\", line {self.callable_line}, in get_exception", + f" callable()", + f" File \"{__file__}\", line {f.__code__.co_firstlineno + 8}, in f", + f' return my_dct["✨🚛✨"]["说明"]["🐍"]["说明"]["🐍🐍"]', + f" ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^", ] self.assertEqual(actual, expected) diff --git a/Lib/traceback.py b/Lib/traceback.py index 0e229553cb5d25..ea045e27610d4d 100644 --- a/Lib/traceback.py +++ b/Lib/traceback.py @@ -465,7 +465,8 @@ def format_frame_summary(self, frame_summary): stripped_line = frame_summary.line.strip() row.append(' {}\n'.format(stripped_line)) - orig_line_len = len(frame_summary._original_line) + line = frame_summary._original_line + orig_line_len = len(line) frame_line_len = len(frame_summary.line.lstrip()) stripped_characters = orig_line_len - frame_line_len if ( @@ -473,31 +474,40 @@ def format_frame_summary(self, frame_summary): and frame_summary.end_colno is not None ): start_offset = _byte_offset_to_character_offset( - frame_summary._original_line, frame_summary.colno) + 1 + line, frame_summary.colno) end_offset = _byte_offset_to_character_offset( - frame_summary._original_line, frame_summary.end_colno) + 1 + line, frame_summary.end_colno) + code_segment = line[start_offset:end_offset] anchors = None if frame_summary.lineno == frame_summary.end_lineno: with suppress(Exception): - anchors = _extract_caret_anchors_from_line_segment( - frame_summary._original_line[start_offset - 1:end_offset - 1] - ) + anchors = _extract_caret_anchors_from_line_segment(code_segment) else: - end_offset = stripped_characters + len(stripped_line) + # Don't count the newline since the anchors only need to + # go up until the last character of the line. + end_offset = len(line.rstrip()) # show indicators if primary char doesn't span the frame line if end_offset - start_offset < len(stripped_line) or ( anchors and anchors.right_start_offset - anchors.left_end_offset > 0): + # When showing this on a terminal, some of the non-ASCII characters + # might be rendered as double-width characters, so we need to take + # that into account when calculating the length of the line. 
+ dp_start_offset = _display_width(line, start_offset) + 1 + dp_end_offset = _display_width(line, end_offset) + 1 + row.append(' ') - row.append(' ' * (start_offset - stripped_characters)) + row.append(' ' * (dp_start_offset - stripped_characters)) if anchors: - row.append(anchors.primary_char * (anchors.left_end_offset)) - row.append(anchors.secondary_char * (anchors.right_start_offset - anchors.left_end_offset)) - row.append(anchors.primary_char * (end_offset - start_offset - anchors.right_start_offset)) + dp_left_end_offset = _display_width(code_segment, anchors.left_end_offset) + dp_right_start_offset = _display_width(code_segment, anchors.right_start_offset) + row.append(anchors.primary_char * dp_left_end_offset) + row.append(anchors.secondary_char * (dp_right_start_offset - dp_left_end_offset)) + row.append(anchors.primary_char * (dp_end_offset - dp_start_offset - dp_right_start_offset)) else: - row.append('^' * (end_offset - start_offset)) + row.append('^' * (dp_end_offset - dp_start_offset)) row.append('\n') @@ -618,6 +628,25 @@ def _extract_caret_anchors_from_line_segment(segment): return None +_WIDE_CHAR_SPECIFIERS = "WF" + +def _display_width(line, offset): + """Calculate the extra amount of width space the given source + code segment might take if it were to be displayed on a fixed + width output device. Supports wide unicode characters and emojis.""" + + # Fast track for ASCII-only strings + if line.isascii(): + return offset + + import unicodedata + + return sum( + 2 if unicodedata.east_asian_width(char) in _WIDE_CHAR_SPECIFIERS else 1 + for char in line[:offset] + ) + + class _ExceptionPrintContext: def __init__(self): diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst new file mode 100644 index 00000000000000..12257ef2b0b9b0 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst @@ -0,0 +1,3 @@ +Traceback location ranges involving wide unicode characters (like emoji and +asian characters) now are properly highlighted. Patch by Batuhan Taskaya and +Pablo Galindo. diff --git a/Parser/pegen.c b/Parser/pegen.c index 87b47bacec553f..3b85b095beb235 100644 --- a/Parser/pegen.c +++ b/Parser/pegen.c @@ -38,6 +38,61 @@ _PyPegen_byte_offset_to_character_offset(PyObject *line, Py_ssize_t col_offset) return size; } +// Calculate the extra amount of width space the given source +// code segment might take if it were to be displayed on a fixed +// width output device. Supports wide unicode characters and emojis. 
+Py_ssize_t +_PyPegen_calculate_display_width(PyObject *line, Py_ssize_t character_offset) +{ + PyObject *segment = PyUnicode_Substring(line, 0, character_offset); + if (!segment) { + return -1; + } + + // Fast track for ascii strings + if (PyUnicode_IS_ASCII(segment)) { + Py_DECREF(segment); + return character_offset; + } + + PyObject *width_fn = _PyImport_GetModuleAttrString("unicodedata", "east_asian_width"); + if (!width_fn) { + return -1; + } + + Py_ssize_t width = 0; + Py_ssize_t len = PyUnicode_GET_LENGTH(segment); + for (Py_ssize_t i = 0; i < len; i++) { + PyObject *chr = PyUnicode_Substring(segment, i, i + 1); + if (!chr) { + Py_DECREF(segment); + Py_DECREF(width_fn); + return -1; + } + + PyObject *width_specifier = PyObject_CallOneArg(width_fn, chr); + Py_DECREF(chr); + if (!width_specifier) { + Py_DECREF(segment); + Py_DECREF(width_fn); + return -1; + } + + if (_PyUnicode_EqualToASCIIString(width_specifier, "W") || + _PyUnicode_EqualToASCIIString(width_specifier, "F")) { + width += 2; + } + else { + width += 1; + } + Py_DECREF(width_specifier); + } + + Py_DECREF(segment); + Py_DECREF(width_fn); + return width; +} + // Here, mark is the start of the node, while p->mark is the end. // If node==NULL, they should be the same. int diff --git a/Parser/pegen.h b/Parser/pegen.h index fe0c327b875566..2c4b2c3dfc65c6 100644 --- a/Parser/pegen.h +++ b/Parser/pegen.h @@ -143,6 +143,7 @@ expr_ty _PyPegen_name_token(Parser *p); expr_ty _PyPegen_number_token(Parser *p); void *_PyPegen_string_token(Parser *p); Py_ssize_t _PyPegen_byte_offset_to_character_offset(PyObject *line, Py_ssize_t col_offset); +Py_ssize_t _PyPegen_calculate_display_width(PyObject *segment, Py_ssize_t character_offset); // Error handling functions and APIs typedef enum { diff --git a/Python/traceback.c b/Python/traceback.c index c4f5ec877bba5d..130f945c290234 100644 --- a/Python/traceback.c +++ b/Python/traceback.c @@ -907,8 +907,39 @@ tb_displayline(PyTracebackObject* tb, PyObject *f, PyObject *filename, int linen goto done; } - if (print_error_location_carets(f, truncation, start_offset, end_offset, - right_start_offset, left_end_offset, + // Convert all offsets to display offsets (e.g. the space they would take up if printed + // on the screen). 
+ Py_ssize_t dp_start = _PyPegen_calculate_display_width(source_line, start_offset); + if (dp_start < 0) { + err = ignore_source_errors() < 0; + goto done; + } + + Py_ssize_t dp_end = _PyPegen_calculate_display_width(source_line, end_offset); + if (dp_end < 0) { + err = ignore_source_errors() < 0; + goto done; + } + + Py_ssize_t dp_left_end = -1; + Py_ssize_t dp_right_start = -1; + if (has_secondary_ranges) { + dp_left_end = _PyPegen_calculate_display_width(source_line, left_end_offset); + if (dp_left_end < 0) { + err = ignore_source_errors() < 0; + goto done; + } + + dp_right_start = _PyPegen_calculate_display_width(source_line, right_start_offset); + if (dp_right_start < 0) { + err = ignore_source_errors() < 0; + goto done; + } + } + + + if (print_error_location_carets(f, truncation, dp_start, dp_end, + dp_right_start, dp_left_end, primary_error_char, secondary_error_char) < 0) { err = -1; goto done; From e7cdcccd2604ee26669173a6f690c0c2e2ef43bf Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Fri, 27 Oct 2023 16:08:53 +0300 Subject: [PATCH 142/265] [3.11] gh-111187: Postpone removal version for locale.getdefaultlocale() to 3.15 (GH-111188) (#111326) --- Doc/library/locale.rst | 2 +- Doc/whatsnew/3.11.rst | 2 +- Lib/locale.py | 9 +++++---- .../2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst | 1 + 4 files changed, 8 insertions(+), 6 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst diff --git a/Doc/library/locale.rst b/Doc/library/locale.rst index 2948105d2925c4..6b9f1d3391997e 100644 --- a/Doc/library/locale.rst +++ b/Doc/library/locale.rst @@ -303,7 +303,7 @@ The :mod:`locale` module defines the following exception and functions: *language code* and *encoding* may be ``None`` if their values cannot be determined. - .. deprecated-removed:: 3.11 3.13 + .. deprecated-removed:: 3.11 3.15 .. function:: getlocale(category=LC_CTYPE) diff --git a/Doc/whatsnew/3.11.rst b/Doc/whatsnew/3.11.rst index fda4fbcb122dcf..698df284c9fd4a 100644 --- a/Doc/whatsnew/3.11.rst +++ b/Doc/whatsnew/3.11.rst @@ -1798,7 +1798,7 @@ Standard Library * :func:`importlib.resources.path` * The :func:`locale.getdefaultlocale` function is deprecated and will be - removed in Python 3.13. Use :func:`locale.setlocale`, + removed in Python 3.15. Use :func:`locale.setlocale`, :func:`locale.getpreferredencoding(False) ` and :func:`locale.getlocale` functions instead. (Contributed by Victor Stinner in :gh:`90817`.) diff --git a/Lib/locale.py b/Lib/locale.py index 7a7694e1bfb71c..f45841ed629e3a 100644 --- a/Lib/locale.py +++ b/Lib/locale.py @@ -556,10 +556,11 @@ def getdefaultlocale(envvars=('LC_ALL', 'LC_CTYPE', 'LANG', 'LANGUAGE')): """ import warnings - warnings.warn( - "Use setlocale(), getencoding() and getlocale() instead", - DeprecationWarning, stacklevel=2 - ) + warnings._deprecated( + "locale.getdefaultlocale", + "{name!r} is deprecated and slated for removal in Python {remove}. " + "Use setlocale(), getencoding() and getlocale() instead.", + remove=(3, 15)) try: # check if it's supported by the _locale module diff --git a/Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst b/Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst new file mode 100644 index 00000000000000..dc2424370bb96c --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst @@ -0,0 +1 @@ +Postpone removal version for locale.getdefaultlocale() to Python 3.15. 
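A rough migration sketch for code that still calls the deprecated function, following the replacements named in the new deprecation message (illustrative only, not part of the patch; the variable names are made up):

    import locale

    # Deprecated, now slated for removal in Python 3.15:
    #     lang, enc = locale.getdefaultlocale()

    # Replacement suggested by the deprecation message:
    locale.setlocale(locale.LC_CTYPE, "")         # honour the user's environment
    lang, _ = locale.getlocale(locale.LC_CTYPE)   # e.g. 'en_US'
    enc = locale.getpreferredencoding(False)      # or locale.getencoding() on 3.11+
    print(lang, enc)
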
From a9e0455bd318cac76016a9272c89b4bb0ff9448b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 27 Oct 2023 17:15:15 +0200 Subject: [PATCH 143/265] [3.11] gh-111276: Clarify docs and comments about the role of LC_CTYPE (GH-111319) (#111392) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Fix locale.LC_CTYPE documentation to no longer mention string.lower() et al. Those functions were removed in Python 3.0: https://docs.python.org/2/library/string.htmlGH-deprecated-string-functions Also, fix a comment in logging about locale-specific behavior of `str.lower()`. (cherry picked from commit 6d42759c5e47ab62d60a72b4ff15d29864554579) Co-authored-by: Łukasz Langa Co-authored-by: Hugo van Kemenade --- Doc/library/locale.rst | 15 ++++++++++----- Lib/logging/handlers.py | 6 ++---- 2 files changed, 12 insertions(+), 9 deletions(-) diff --git a/Doc/library/locale.rst b/Doc/library/locale.rst index 6b9f1d3391997e..5ea5957eb143cd 100644 --- a/Doc/library/locale.rst +++ b/Doc/library/locale.rst @@ -476,11 +476,16 @@ The :mod:`locale` module defines the following exception and functions: .. data:: LC_CTYPE - .. index:: pair: module; string - - Locale category for the character type functions. Depending on the settings of - this category, the functions of module :mod:`string` dealing with case change - their behaviour. + Locale category for the character type functions. Most importantly, this + category defines the text encoding, i.e. how bytes are interpreted as + Unicode codepoints. See :pep:`538` and :pep:`540` for how this variable + might be automatically coerced to ``C.UTF-8`` to avoid issues created by + invalid settings in containers or incompatible settings passed over remote + SSH connections. + + Python doesn't internally use locale-dependent character transformation functions + from ``ctype.h``. Instead, an internal ``pyctype.h`` provides locale-independent + equivalents like :c:macro:`!Py_TOLOWER`. .. data:: LC_COLLATE diff --git a/Lib/logging/handlers.py b/Lib/logging/handlers.py index eadd1b6340bcb6..81041488c8b202 100644 --- a/Lib/logging/handlers.py +++ b/Lib/logging/handlers.py @@ -833,10 +833,8 @@ class SysLogHandler(logging.Handler): "local7": LOG_LOCAL7, } - #The map below appears to be trivially lowercasing the key. However, - #there's more to it than meets the eye - in some locales, lowercasing - #gives unexpected results. See SF #1524081: in the Turkish locale, - #"INFO".lower() != "info" + # Originally added to work around GH-43683. Unnecessary since GH-50043 but kept + # for backwards compatibility. priority_map = { "DEBUG" : "debug", "INFO" : "info", From e84b06c05a90c2f32a8e904f0a43ccc8c1cc88e3 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 27 Oct 2023 20:45:52 +0200 Subject: [PATCH 144/265] [3.11] gh-111406: Fix broken link to bpython's site (GH-111407) (#111409) gh-111406: Fix broken link to bpython's site (GH-111407) (cherry picked from commit 8a158a753c48d166ebceae0687e88ae0c0725c02) Co-authored-by: Zack Cerza --- Doc/tutorial/interactive.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/tutorial/interactive.rst b/Doc/tutorial/interactive.rst index 0d3896a4832b59..4e054c4e6c2c32 100644 --- a/Doc/tutorial/interactive.rst +++ b/Doc/tutorial/interactive.rst @@ -51,4 +51,4 @@ bpython_. .. _GNU Readline: https://tiswww.case.edu/php/chet/readline/rltop.html .. 
_IPython: https://ipython.org/ -.. _bpython: https://www.bpython-interpreter.org/ +.. _bpython: https://bpython-interpreter.org/ From 1a01ca44d61d6d90be159771576d6868515399dd Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 28 Oct 2023 01:36:24 +0200 Subject: [PATCH 145/265] [3.11] gh-110205: Fix asyncio ThreadedChildWatcher._join_threads() (GH-110884) (#111413) - `ThreadedChildWatcher.close()` is now *officially* a no-op; `_join_threads()` never did anything. - Threads created by that class are now named `asyncio-waitpid-NNN`. - `test.test_asyncio.utils.TestCase.close_loop()` now waits for the child watcher's threads, but not forever; if a thread hangs, it raises `RuntimeError`. (cherry picked from commit c3bb10c9303503e7b55a7bdf9acfa6b3bcb699c6) Co-authored-by: Guido van Rossum --- Lib/asyncio/unix_events.py | 11 ++--------- Lib/test/test_asyncio/utils.py | 11 ++++++++--- 2 files changed, 10 insertions(+), 12 deletions(-) diff --git a/Lib/asyncio/unix_events.py b/Lib/asyncio/unix_events.py index 0d4ba72603e675..77d2670f833fbc 100644 --- a/Lib/asyncio/unix_events.py +++ b/Lib/asyncio/unix_events.py @@ -1363,14 +1363,7 @@ def is_active(self): return True def close(self): - self._join_threads() - - def _join_threads(self): - """Internal: Join all non-daemon threads""" - threads = [thread for thread in list(self._threads.values()) - if thread.is_alive() and not thread.daemon] - for thread in threads: - thread.join() + pass def __enter__(self): return self @@ -1389,7 +1382,7 @@ def __del__(self, _warn=warnings.warn): def add_child_handler(self, pid, callback, *args): loop = events.get_running_loop() thread = threading.Thread(target=self._do_waitpid, - name=f"waitpid-{next(self._pid_counter)}", + name=f"asyncio-waitpid-{next(self._pid_counter)}", args=(loop, pid, callback, args), daemon=True) self._threads[pid] = thread diff --git a/Lib/test/test_asyncio/utils.py b/Lib/test/test_asyncio/utils.py index 7940855b19efed..045e385511bbae 100644 --- a/Lib/test/test_asyncio/utils.py +++ b/Lib/test/test_asyncio/utils.py @@ -548,6 +548,7 @@ def close_loop(loop): else: loop._default_executor.shutdown(wait=True) loop.close() + policy = support.maybe_get_event_loop_policy() if policy is not None: try: @@ -557,9 +558,13 @@ def close_loop(loop): pass else: if isinstance(watcher, asyncio.ThreadedChildWatcher): - threads = list(watcher._threads.values()) - for thread in threads: - thread.join() + # Wait for subprocess to finish, but not forever + for thread in list(watcher._threads.values()): + thread.join(timeout=support.SHORT_TIMEOUT) + if thread.is_alive(): + raise RuntimeError(f"thread {thread} still alive: " + "subprocess still running") + def set_event_loop(self, loop, *, cleanup=True): if loop is None: From da1736b06a40dc2af4ecb587effd3a119cfcca80 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 28 Oct 2023 09:51:57 +0200 Subject: [PATCH 146/265] [3.11] CI: Include Python version in cache.config key (GH-111410) (#111422) CI: Include Python version in cache.config key (GH-111410) * Include Python version in cache.config key, after Python setup * Remove EOL 3.7 from branch triggers (cherry picked from commit 9d4a1a480b65196c3aabbcd2d165d1fb86d0c8e5) Co-authored-by: Hugo van Kemenade --- .github/workflows/build.yml | 12 +++++------- 1 file changed, 5 insertions(+), 7 deletions(-) diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index 
cebcc3ef182b8f..71901d4cab2ae5 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -12,7 +12,6 @@ on: - '3.10' - '3.9' - '3.8' - - '3.7' pull_request: branches: - 'main' @@ -20,7 +19,6 @@ on: - '3.10' - '3.9' - '3.8' - - '3.7' permissions: contents: read @@ -144,14 +142,14 @@ jobs: if: needs.check_source.outputs.run_tests == 'true' steps: - uses: actions/checkout@v4 + - uses: actions/setup-python@v4 + with: + python-version: '3.x' - name: Restore config.cache uses: actions/cache@v3 with: path: config.cache - key: ${{ github.job }}-${{ runner.os }}-${{ needs.check_source.outputs.config_hash }} - - uses: actions/setup-python@v4 - with: - python-version: '3.x' + key: ${{ github.job }}-${{ runner.os }}-${{ needs.check_source.outputs.config_hash }}-${{ env.pythonLocation }} - name: Install Dependencies run: sudo ./.github/workflows/posix-deps-apt.sh - name: Add ccache to PATH @@ -296,7 +294,7 @@ jobs: - uses: actions/checkout@v4 - name: Register gcc problem matcher run: echo "::add-matcher::.github/problem-matchers/gcc.json" - - name: Install Dependencies + - name: Install dependencies run: sudo ./.github/workflows/posix-deps-apt.sh - name: Configure OpenSSL env vars run: | From 68c03cef245e5baf43b86f77fa1313c17098dc0a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 29 Oct 2023 01:23:38 +0200 Subject: [PATCH 147/265] [3.11] gh-111426: Remove `test_cmd.test_coverage` (GH-111427) (#111433) gh-111426: Remove `test_cmd.test_coverage` (GH-111427) (cherry picked from commit 66bea2555dc7b3dd18282cc699fe9a22dea50de3) Co-authored-by: Nikita Sobolev --- Lib/test/test_cmd.py | 12 +----------- 1 file changed, 1 insertion(+), 11 deletions(-) diff --git a/Lib/test/test_cmd.py b/Lib/test/test_cmd.py index 319801c71f776b..28f80766677e59 100644 --- a/Lib/test/test_cmd.py +++ b/Lib/test/test_cmd.py @@ -248,19 +248,9 @@ def load_tests(loader, tests, pattern): tests.addTest(doctest.DocTestSuite()) return tests -def test_coverage(coverdir): - trace = support.import_module('trace') - tracer=trace.Trace(ignoredirs=[sys.base_prefix, sys.base_exec_prefix,], - trace=0, count=1) - tracer.run('import importlib; importlib.reload(cmd); test_main()') - r=tracer.results() - print("Writing coverage results...") - r.write_results(show_missing=True, summary=True, coverdir=coverdir) if __name__ == "__main__": - if "-c" in sys.argv: - test_coverage('/tmp/cmd.cover') - elif "-i" in sys.argv: + if "-i" in sys.argv: samplecmdclass().cmdloop() else: unittest.main() From a10fc6670b439ffe74acf088caeea82ad4726e60 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 29 Oct 2023 20:39:25 +0100 Subject: [PATCH 148/265] [3.11] gh-101100: Fix sphinx warnings in `library/asyncio-eventloop.rst` (GH-111222) (#111470) gh-101100: Fix sphinx warnings in `library/asyncio-eventloop.rst` (GH-111222) * gh-101100: Fix sphinx warnings in `library/asyncio-eventloop.rst` * Update Doc/library/socket.rst * Update asyncio-eventloop.rst * Update socket.rst --------- (cherry picked from commit 46389c32750f79ab3f398a0132cd002e8a64f809) Co-authored-by: Nikita Sobolev Co-authored-by: Hugo van Kemenade --- Doc/library/asyncio-eventloop.rst | 14 ++++++++------ Doc/library/socket.rst | 7 +++++++ Doc/tools/.nitignore | 1 - 3 files changed, 15 insertions(+), 7 deletions(-) diff --git a/Doc/library/asyncio-eventloop.rst b/Doc/library/asyncio-eventloop.rst index 5e40b50056d65a..32daaa7bca2123 100644 --- 
a/Doc/library/asyncio-eventloop.rst +++ b/Doc/library/asyncio-eventloop.rst @@ -495,7 +495,7 @@ Opening network connections .. versionchanged:: 3.6 - The socket option :py:const:`~socket.TCP_NODELAY` is set by default + The socket option :ref:`socket.TCP_NODELAY ` is set by default for all TCP connections. .. versionchanged:: 3.7 @@ -564,7 +564,7 @@ Opening network connections * *reuse_port* tells the kernel to allow this endpoint to be bound to the same port as other existing endpoints are bound to, so long as they all set this flag when being created. This option is not supported on Windows - and some Unixes. If the :py:const:`~socket.SO_REUSEPORT` constant is not + and some Unixes. If the :ref:`socket.SO_REUSEPORT ` constant is not defined then this capability is unsupported. * *allow_broadcast* tells the kernel to allow this endpoint to send @@ -590,7 +590,8 @@ Opening network connections .. versionchanged:: 3.8.1 The *reuse_address* parameter is no longer supported, as using - :py:const:`~sockets.SO_REUSEADDR` poses a significant security concern for + :ref:`socket.SO_REUSEADDR ` + poses a significant security concern for UDP. Explicitly passing ``reuse_address=True`` will raise an exception. When multiple processes with differing UIDs assign sockets to an @@ -599,7 +600,8 @@ Opening network connections For supported platforms, *reuse_port* can be used as a replacement for similar functionality. With *reuse_port*, - :py:const:`~sockets.SO_REUSEPORT` is used instead, which specifically + :ref:`socket.SO_REUSEPORT ` + is used instead, which specifically prevents processes with differing UIDs from assigning sockets to the same socket address. @@ -741,7 +743,7 @@ Creating network servers .. versionchanged:: 3.6 Added *ssl_handshake_timeout* and *start_serving* parameters. - The socket option :py:const:`~socket.TCP_NODELAY` is set by default + The socket option :ref:`socket.TCP_NODELAY ` is set by default for all TCP connections. .. versionchanged:: 3.11 @@ -1853,7 +1855,7 @@ Set signal handlers for SIGINT and SIGTERM (This ``signals`` example only works on Unix.) -Register handlers for signals :py:data:`SIGINT` and :py:data:`SIGTERM` +Register handlers for signals :const:`~signal.SIGINT` and :const:`~signal.SIGTERM` using the :meth:`loop.add_signal_handler` method:: import asyncio diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index 7193713984f169..b5c1edaf5ad8e0 100644 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -326,6 +326,11 @@ Constants defined then this protocol is unsupported. More constants may be available depending on the system. +.. data:: AF_UNSPEC + + :const:`AF_UNSPEC` means that + :func:`getaddrinfo` should return socket addresses for any + address family (either IPv4, IPv6, or any other) that can be used. .. data:: SOCK_STREAM SOCK_DGRAM @@ -354,6 +359,8 @@ Constants .. versionadded:: 3.2 +.. _socket-unix-constants: + .. 
data:: SO_* SOMAXCONN MSG_* diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 5dc8bdc61f6ce5..e59d6198032314 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -31,7 +31,6 @@ Doc/howto/urllib2.rst Doc/library/__future__.rst Doc/library/abc.rst Doc/library/ast.rst -Doc/library/asyncio-eventloop.rst Doc/library/asyncio-extending.rst Doc/library/asyncio-policy.rst Doc/library/asyncio-stream.rst From 9a12c23fe417efc21b0c5412355bddbf0c3c6324 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Sun, 29 Oct 2023 21:45:51 +0200 Subject: [PATCH 149/265] [3.11] gh-111165: Remove documentation for moved functions (GH-111467) (GH-111472) (cherry picked from commit 4d6bdf8aabcc92303041420a96750fbc52c9f213) --- Doc/library/test.rst | 10 ---------- 1 file changed, 10 deletions(-) diff --git a/Doc/library/test.rst b/Doc/library/test.rst index a27e88fabe2939..7fc2a7c8466f66 100644 --- a/Doc/library/test.rst +++ b/Doc/library/test.rst @@ -489,16 +489,6 @@ The :mod:`test.support` module defines the following functions: rather than looking directly in the path directories. -.. function:: match_test(test) - - Determine whether *test* matches the patterns set in :func:`set_match_tests`. - - -.. function:: set_match_tests(accept_patterns=None, ignore_patterns=None) - - Define match patterns on test filenames and test method names for filtering tests. - - .. function:: setswitchinterval(interval) Set the :func:`sys.setswitchinterval` to the given *interval*. Defines From 78150c6f7bf2b21f0bddaa87e6ab7b2fbaa4a6db Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 30 Oct 2023 18:44:27 +0100 Subject: [PATCH 150/265] [3.11] gh-111284: Make multiprocessing tests with threads faster and more reliable (GH-111285) (GH-111511) (cherry picked from commit 624ace5a2f02715d084c29eaf2211cd0dd550690) Co-authored-by: Serhiy Storchaka --- Lib/test/_test_multiprocessing.py | 30 +++++++++++++++++++++--------- 1 file changed, 21 insertions(+), 9 deletions(-) diff --git a/Lib/test/_test_multiprocessing.py b/Lib/test/_test_multiprocessing.py index 3bcb1d1d46991f..3d581d695bc06e 100644 --- a/Lib/test/_test_multiprocessing.py +++ b/Lib/test/_test_multiprocessing.py @@ -2437,8 +2437,11 @@ def test_namespace(self): # # -def sqr(x, wait=0.0): - time.sleep(wait) +def sqr(x, wait=0.0, event=None): + if event is None: + time.sleep(wait) + else: + event.wait(wait) return x*x def mul(x, y): @@ -2577,10 +2580,18 @@ def test_async(self): self.assertTimingAlmostEqual(get.elapsed, TIMEOUT1) def test_async_timeout(self): - res = self.pool.apply_async(sqr, (6, TIMEOUT2 + support.SHORT_TIMEOUT)) - get = TimingWrapper(res.get) - self.assertRaises(multiprocessing.TimeoutError, get, timeout=TIMEOUT2) - self.assertTimingAlmostEqual(get.elapsed, TIMEOUT2) + p = self.Pool(3) + try: + event = threading.Event() if self.TYPE == 'threads' else None + res = p.apply_async(sqr, (6, TIMEOUT2 + support.SHORT_TIMEOUT, event)) + get = TimingWrapper(res.get) + self.assertRaises(multiprocessing.TimeoutError, get, timeout=TIMEOUT2) + self.assertTimingAlmostEqual(get.elapsed, TIMEOUT2) + finally: + if event is not None: + event.set() + p.terminate() + p.join() def test_imap(self): it = self.pool.imap(sqr, list(range(10))) @@ -2682,10 +2693,11 @@ def test_make_pool(self): def test_terminate(self): # Simulate slow tasks which take "forever" to complete + p = self.Pool(3) args = [support.LONG_TIMEOUT for i in range(10_000)] - result = self.pool.map_async(time.sleep, args, chunksize=1) 
- self.pool.terminate() - self.pool.join() + result = p.map_async(time.sleep, args, chunksize=1) + p.terminate() + p.join() def test_empty_iterable(self): # See Issue 12157 From a9d5966bce9bb347002d4da7672bf49fe4274ea4 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 02:14:30 +0100 Subject: [PATCH 151/265] [3.11] gh-111347: Remove wrong assertion in test_sendfile (GH-111377) (#111462) gh-111347: Remove wrong assertion in test_sendfile (GH-111377) Windows is different. (cherry picked from commit fa35b9e89b2e207fc8bae9eb0284260d0d922e7a) Co-authored-by: zcxsythenew <30565051+zcxsythenew@users.noreply.github.com> --- Lib/test/test_asyncio/test_sendfile.py | 7 +++++-- 1 file changed, 5 insertions(+), 2 deletions(-) diff --git a/Lib/test/test_asyncio/test_sendfile.py b/Lib/test/test_asyncio/test_sendfile.py index 0198da21d77028..d33ff197bbfa1d 100644 --- a/Lib/test/test_asyncio/test_sendfile.py +++ b/Lib/test/test_asyncio/test_sendfile.py @@ -470,8 +470,11 @@ def test_sendfile_close_peer_in_the_middle_of_receiving(self): self.assertTrue(1024 <= srv_proto.nbytes < len(self.DATA), srv_proto.nbytes) - self.assertTrue(1024 <= self.file.tell() < len(self.DATA), - self.file.tell()) + if not (sys.platform == 'win32' + and isinstance(self.loop, asyncio.ProactorEventLoop)): + # On Windows, Proactor uses transmitFile, which does not update tell() + self.assertTrue(1024 <= self.file.tell() < len(self.DATA), + self.file.tell()) self.assertTrue(cli_proto.transport.is_closing()) def test_sendfile_fallback_close_peer_in_the_middle_of_receiving(self): From c5f6c6396d15549c1e68094ed9811ae4316863f5 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 06:47:57 +0100 Subject: [PATCH 152/265] [3.11] Remove myself from typing CODEOWNERS (GH-111523) (#111526) Remove myself from typing CODEOWNERS (GH-111523) (cherry picked from commit 804a207c168b876112984709edc3a94afa433c69) Co-authored-by: Ken Jin --- .github/CODEOWNERS | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/.github/CODEOWNERS b/.github/CODEOWNERS index 1b156c829cd9aa..6c29ba321b923f 100644 --- a/.github/CODEOWNERS +++ b/.github/CODEOWNERS @@ -137,7 +137,7 @@ Lib/ast.py @isidentical **/*idlelib* @terryjreedy -**/*typing* @gvanrossum @Fidget-Spinner @JelleZijlstra @AlexWaygood +**/*typing* @gvanrossum @JelleZijlstra @AlexWaygood **/*asyncore @giampaolo **/*asynchat @giampaolo From c66f0bedebeb0f63f02999fdf5c9ce9e045ea97d Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 08:13:29 +0100 Subject: [PATCH 153/265] [3.11] gh-111531: Tkinter: fix reference leaks in bind_class() and bind_all() (GH-111533) (GH-111536) (cherry picked from commit e3353c498d79f0f3f108a9baf8807a12e77c2ebe) Co-authored-by: Serhiy Storchaka --- Lib/tkinter/__init__.py | 4 ++-- .../Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst | 2 ++ 2 files changed, 4 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst diff --git a/Lib/tkinter/__init__.py b/Lib/tkinter/__init__.py index 6ae9839055382c..704e7b8364b8ea 100644 --- a/Lib/tkinter/__init__.py +++ b/Lib/tkinter/__init__.py @@ -1459,7 +1459,7 @@ def bind_all(self, sequence=None, func=None, add=None): An additional boolean parameter ADD specifies whether FUNC will be called additionally to the other bound function or whether 
it will replace the previous function. See bind for the return value.""" - return self._bind(('bind', 'all'), sequence, func, add, 0) + return self._root()._bind(('bind', 'all'), sequence, func, add, True) def unbind_all(self, sequence): """Unbind for all widgets for event SEQUENCE all functions.""" @@ -1473,7 +1473,7 @@ def bind_class(self, className, sequence=None, func=None, add=None): whether it will replace the previous function. See bind for the return value.""" - return self._bind(('bind', className), sequence, func, add, 0) + return self._root()._bind(('bind', className), sequence, func, add, True) def unbind_class(self, className, sequence): """Unbind for all widgets with bindtag CLASSNAME for event SEQUENCE diff --git a/Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst b/Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst new file mode 100644 index 00000000000000..b722f0414184b1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst @@ -0,0 +1,2 @@ +Fix reference leaks in ``bind_class()`` and ``bind_all()`` methods of +:mod:`tkinter` widgets. From 08e4e11b758517ad614d71ff2377dd4057ffdcb1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 14:29:42 +0100 Subject: [PATCH 154/265] [3.11] gh-111380: Show SyntaxWarnings only once when parsing if invalid syntax is encouintered (GH-111381) (#111383) gh-111380: Show SyntaxWarnings only once when parsing if invalid syntax is encouintered (GH-111381) (cherry picked from commit 3d2f1f0b830d86f16f42c42b54d3ea4453dac318) Co-authored-by: Pablo Galindo Salgado --- Lib/test/test_string_literals.py | 12 ++++++++++++ .../2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst | 2 ++ Parser/string_parser.c | 5 +++++ 3 files changed, 19 insertions(+) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst diff --git a/Lib/test/test_string_literals.py b/Lib/test/test_string_literals.py index 7247b7e48bc2b6..aeec703d5f5abd 100644 --- a/Lib/test/test_string_literals.py +++ b/Lib/test/test_string_literals.py @@ -131,6 +131,18 @@ def test_eval_str_invalid_escape(self): self.assertEqual(exc.lineno, 1) self.assertEqual(exc.offset, 1) + # Check that the warning is raised ony once if there are syntax errors + + with warnings.catch_warnings(record=True) as w: + warnings.simplefilter('always', category=DeprecationWarning) + with self.assertRaises(SyntaxError) as cm: + eval("'\\e' $") + exc = cm.exception + self.assertEqual(len(w), 1) + self.assertEqual(w[0].category, DeprecationWarning) + self.assertRegex(str(w[0].message), 'invalid escape sequence') + self.assertEqual(w[0].filename, '') + def test_eval_str_invalid_octal_escape(self): for i in range(0o400, 0o1000): with self.assertWarns(DeprecationWarning): diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst new file mode 100644 index 00000000000000..4ce6398dbfe3b1 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst @@ -0,0 +1,2 @@ +Fix a bug that was causing :exc:`SyntaxWarning` to appear twice when parsing +if invalid syntax is encountered later. 
Patch by Pablo galindo diff --git a/Parser/string_parser.c b/Parser/string_parser.c index fb2b9808af15a2..7079b82d04f8ec 100644 --- a/Parser/string_parser.c +++ b/Parser/string_parser.c @@ -11,6 +11,11 @@ static int warn_invalid_escape_sequence(Parser *p, const char *first_invalid_escape, Token *t) { + if (p->call_invalid_rules) { + // Do not report warnings if we are in the second pass of the parser + // to avoid showing the warning twice. + return 0; + } unsigned char c = *first_invalid_escape; int octal = ('4' <= c && c <= '7'); PyObject *msg = From bc9f47097ef5c44c87cb91a441c19ee532907684 Mon Sep 17 00:00:00 2001 From: Nikita Sobolev Date: Tue, 31 Oct 2023 17:00:39 +0300 Subject: [PATCH 155/265] [3.11] gh-108303: Move all inspect test files to `test_inspect/` (GH-109607) (#111543) --- Lib/test/libregrtest/findtests.py | 1 + Lib/test/test_import/__init__.py | 1 - Lib/test/test_inspect/__init__.py | 6 ++++++ Lib/test/{ => test_inspect}/inspect_fodder.py | 0 Lib/test/{ => test_inspect}/inspect_fodder2.py | 0 .../{ => test_inspect}/inspect_stock_annotations.py | 0 .../inspect_stringized_annotations.py | 0 .../inspect_stringized_annotations_2.py | 0 Lib/test/{ => test_inspect}/test_inspect.py | 11 ++++++----- Lib/test/test_tokenize.py | 2 +- Makefile.pre.in | 1 + 11 files changed, 15 insertions(+), 7 deletions(-) create mode 100644 Lib/test/test_inspect/__init__.py rename Lib/test/{ => test_inspect}/inspect_fodder.py (100%) rename Lib/test/{ => test_inspect}/inspect_fodder2.py (100%) rename Lib/test/{ => test_inspect}/inspect_stock_annotations.py (100%) rename Lib/test/{ => test_inspect}/inspect_stringized_annotations.py (100%) rename Lib/test/{ => test_inspect}/inspect_stringized_annotations_2.py (100%) rename Lib/test/{ => test_inspect}/test_inspect.py (99%) diff --git a/Lib/test/libregrtest/findtests.py b/Lib/test/libregrtest/findtests.py index a4235b52eca0af..78343775bc5b99 100644 --- a/Lib/test/libregrtest/findtests.py +++ b/Lib/test/libregrtest/findtests.py @@ -21,6 +21,7 @@ "test_concurrent_futures", "test_future_stmt", "test_gdb", + "test_inspect", "test_multiprocessing_fork", "test_multiprocessing_forkserver", "test_multiprocessing_spawn", diff --git a/Lib/test/test_import/__init__.py b/Lib/test/test_import/__init__.py index 131aebbd4e1f56..8632ac2e818fab 100644 --- a/Lib/test/test_import/__init__.py +++ b/Lib/test/test_import/__init__.py @@ -1,5 +1,4 @@ import builtins -import contextlib import errno import glob import importlib.util diff --git a/Lib/test/test_inspect/__init__.py b/Lib/test/test_inspect/__init__.py new file mode 100644 index 00000000000000..f2a39a3fe29c7f --- /dev/null +++ b/Lib/test/test_inspect/__init__.py @@ -0,0 +1,6 @@ +import os +from test import support + + +def load_tests(*args): + return support.load_package_tests(os.path.dirname(__file__), *args) diff --git a/Lib/test/inspect_fodder.py b/Lib/test/test_inspect/inspect_fodder.py similarity index 100% rename from Lib/test/inspect_fodder.py rename to Lib/test/test_inspect/inspect_fodder.py diff --git a/Lib/test/inspect_fodder2.py b/Lib/test/test_inspect/inspect_fodder2.py similarity index 100% rename from Lib/test/inspect_fodder2.py rename to Lib/test/test_inspect/inspect_fodder2.py diff --git a/Lib/test/inspect_stock_annotations.py b/Lib/test/test_inspect/inspect_stock_annotations.py similarity index 100% rename from Lib/test/inspect_stock_annotations.py rename to Lib/test/test_inspect/inspect_stock_annotations.py diff --git a/Lib/test/inspect_stringized_annotations.py 
b/Lib/test/test_inspect/inspect_stringized_annotations.py similarity index 100% rename from Lib/test/inspect_stringized_annotations.py rename to Lib/test/test_inspect/inspect_stringized_annotations.py diff --git a/Lib/test/inspect_stringized_annotations_2.py b/Lib/test/test_inspect/inspect_stringized_annotations_2.py similarity index 100% rename from Lib/test/inspect_stringized_annotations_2.py rename to Lib/test/test_inspect/inspect_stringized_annotations_2.py diff --git a/Lib/test/test_inspect.py b/Lib/test/test_inspect/test_inspect.py similarity index 99% rename from Lib/test/test_inspect.py rename to Lib/test/test_inspect/test_inspect.py index 650d56e39c415c..6d4050a5ded100 100644 --- a/Lib/test/test_inspect.py +++ b/Lib/test/test_inspect/test_inspect.py @@ -30,12 +30,13 @@ from test.support.import_helper import DirsOnSysPath, ready_to_import from test.support.os_helper import TESTFN from test.support.script_helper import assert_python_ok, assert_python_failure -from test import inspect_fodder as mod -from test import inspect_fodder2 as mod2 from test import support -from test import inspect_stock_annotations -from test import inspect_stringized_annotations -from test import inspect_stringized_annotations_2 + +from . import inspect_fodder as mod +from . import inspect_fodder2 as mod2 +from . import inspect_stock_annotations +from . import inspect_stringized_annotations +from . import inspect_stringized_annotations_2 # Functions tested in this suite: diff --git a/Lib/test/test_tokenize.py b/Lib/test/test_tokenize.py index 62f152fe431d5b..b5f57f693474d2 100644 --- a/Lib/test/test_tokenize.py +++ b/Lib/test/test_tokenize.py @@ -1624,7 +1624,7 @@ def test_random_files(self): # 7 more testfiles fail. Remove them also until the failure is diagnosed. testfiles.remove(os.path.join(tempdir, "test_unicode_identifiers.py")) - for f in ('buffer', 'builtin', 'fileio', 'inspect', 'os', 'platform', 'sys'): + for f in ('buffer', 'builtin', 'fileio', 'os', 'platform', 'sys'): testfiles.remove(os.path.join(tempdir, "test_%s.py") % f) if not support.is_resource_enabled("cpu"): diff --git a/Makefile.pre.in b/Makefile.pre.in index 49238902718eb8..ada866d83c0423 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1969,6 +1969,7 @@ TESTSUBDIRS= ctypes/test \ test/test_email/data \ test/test_future_stmt \ test/test_gdb \ + test/test_inspect \ test/test_import \ test/test_import/data \ test/test_import/data/circular_imports \ From 19a266ca8961d8dcb85799adb1025e4f4fb2f087 Mon Sep 17 00:00:00 2001 From: Pablo Galindo Salgado Date: Tue, 31 Oct 2023 14:41:20 +0000 Subject: [PATCH 156/265] [3.11] gh-111366: Correctly show custom syntax error messages in the codeop module functions (GH-111384). 
(#111516) (cherry picked from commit cd6e0a04a16535d8bc727c84f73730c53267184e) --- Lib/codeop.py | 19 +++++++++++++------ Lib/test/test_codeop.py | 14 ++++++++++++++ ...-10-27-12-17-49.gh-issue-111366._TSknV.rst | 3 +++ 3 files changed, 30 insertions(+), 6 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst diff --git a/Lib/codeop.py b/Lib/codeop.py index 2213b69f231f92..e64911ee5ad417 100644 --- a/Lib/codeop.py +++ b/Lib/codeop.py @@ -70,8 +70,7 @@ def _maybe_compile(compiler, source, filename, symbol): return None # fallthrough - return compiler(source, filename, symbol) - + return compiler(source, filename, symbol, incomplete_input=False) def _is_syntax_error(err1, err2): rep1 = repr(err1) @@ -82,8 +81,12 @@ def _is_syntax_error(err1, err2): return True return False -def _compile(source, filename, symbol): - return compile(source, filename, symbol, PyCF_DONT_IMPLY_DEDENT | PyCF_ALLOW_INCOMPLETE_INPUT) +def _compile(source, filename, symbol, incomplete_input=True): + flags = 0 + if incomplete_input: + flags |= PyCF_ALLOW_INCOMPLETE_INPUT + flags |= PyCF_DONT_IMPLY_DEDENT + return compile(source, filename, symbol, flags) def compile_command(source, filename="", symbol="single"): r"""Compile a command and determine whether it is incomplete. @@ -114,8 +117,12 @@ class Compile: def __init__(self): self.flags = PyCF_DONT_IMPLY_DEDENT | PyCF_ALLOW_INCOMPLETE_INPUT - def __call__(self, source, filename, symbol): - codeob = compile(source, filename, symbol, self.flags, True) + def __call__(self, source, filename, symbol, **kwargs): + flags = self.flags + if kwargs.get('incomplete_input', True) is False: + flags &= ~PyCF_DONT_IMPLY_DEDENT + flags &= ~PyCF_ALLOW_INCOMPLETE_INPUT + codeob = compile(source, filename, symbol, flags, True) for feature in _features: if codeob.co_flags & feature.compiler_flag: self.flags |= feature.compiler_flag diff --git a/Lib/test/test_codeop.py b/Lib/test/test_codeop.py index 133096d25a44bc..94ea06f79d3618 100644 --- a/Lib/test/test_codeop.py +++ b/Lib/test/test_codeop.py @@ -7,6 +7,7 @@ import warnings from test import support from test.support import warnings_helper +from textwrap import dedent from codeop import compile_command, PyCF_DONT_IMPLY_DEDENT import io @@ -341,6 +342,19 @@ def test_invalid_warning(self): self.assertRegex(str(w[0].message), 'invalid escape sequence') self.assertEqual(w[0].filename, '') + def assertSyntaxErrorMatches(self, code, message): + with self.subTest(code): + with self.assertRaisesRegex(SyntaxError, message): + compile_command(code, symbol='exec') + + def test_syntax_errors(self): + self.assertSyntaxErrorMatches( + dedent("""\ + def foo(x,x): + pass + """), "duplicate argument 'x' in function definition") + + if __name__ == "__main__": unittest.main() diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst new file mode 100644 index 00000000000000..7e76ce916ea714 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst @@ -0,0 +1,3 @@ +Fix an issue in the :mod:`codeop` that was causing :exc:`SyntaxError` +exceptions raised in the presence of invalid syntax to not contain precise +error messages. 
Patch by Pablo Galindo From a4aa213b65ab4c9327dc9099eb03c4ac92182884 Mon Sep 17 00:00:00 2001 From: Pablo Galindo Salgado Date: Tue, 31 Oct 2023 15:59:31 +0000 Subject: [PATCH 157/265] [3.11] gh-109181: Speed up Traceback object creation by lazily compute the line number (GH-111548) (#111550) . (cherry picked from commit abb15420c11d9dda9c89f74eac8417240b321109) --- ...-10-31-14-25-21.gh-issue-109181.11h6Mc.rst | 2 ++ Python/traceback.c | 35 +++++++++++++++---- 2 files changed, 31 insertions(+), 6 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst new file mode 100644 index 00000000000000..61a15b471cfb27 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst @@ -0,0 +1,2 @@ +Speed up :obj:`Traceback` object creation by lazily compute the line number. +Patch by Pablo Galindo diff --git a/Python/traceback.c b/Python/traceback.c index 130f945c290234..664a73fdfbaa31 100644 --- a/Python/traceback.c +++ b/Python/traceback.c @@ -111,6 +111,26 @@ tb_next_get(PyTracebackObject *self, void *Py_UNUSED(_)) return ret; } +static int +tb_get_lineno(PyTracebackObject* tb) { + PyFrameObject* frame = tb->tb_frame; + assert(frame != NULL); + return PyCode_Addr2Line(PyFrame_GetCode(frame), tb->tb_lasti); +} + +static PyObject * +tb_lineno_get(PyTracebackObject *self, void *Py_UNUSED(_)) +{ + int lineno = self->tb_lineno; + if (lineno == -1) { + lineno = tb_get_lineno(self); + if (lineno < 0) { + Py_RETURN_NONE; + } + } + return PyLong_FromLong(lineno); +} + static int tb_next_set(PyTracebackObject *self, PyObject *new_next, void *Py_UNUSED(_)) { @@ -157,12 +177,12 @@ static PyMethodDef tb_methods[] = { static PyMemberDef tb_memberlist[] = { {"tb_frame", T_OBJECT, OFF(tb_frame), READONLY|PY_AUDIT_READ}, {"tb_lasti", T_INT, OFF(tb_lasti), READONLY}, - {"tb_lineno", T_INT, OFF(tb_lineno), READONLY}, {NULL} /* Sentinel */ }; static PyGetSetDef tb_getsetters[] = { {"tb_next", (getter)tb_next_get, (setter)tb_next_set, NULL, NULL}, + {"tb_lineno", (getter)tb_lineno_get, NULL, NULL, NULL}, {NULL} /* Sentinel */ }; @@ -241,8 +261,7 @@ _PyTraceBack_FromFrame(PyObject *tb_next, PyFrameObject *frame) assert(tb_next == NULL || PyTraceBack_Check(tb_next)); assert(frame != NULL); int addr = _PyInterpreterFrame_LASTI(frame->f_frame) * sizeof(_Py_CODEUNIT); - return tb_create_raw((PyTracebackObject *)tb_next, frame, addr, - PyFrame_GetLineNumber(frame)); + return tb_create_raw((PyTracebackObject *)tb_next, frame, addr, -1); } @@ -990,9 +1009,13 @@ tb_printinternal(PyTracebackObject *tb, PyObject *f, long limit, } while (tb != NULL) { code = PyFrame_GetCode(tb->tb_frame); + int tb_lineno = tb->tb_lineno; + if (tb_lineno == -1) { + tb_lineno = tb_get_lineno(tb); + } if (last_file == NULL || code->co_filename != last_file || - last_line == -1 || tb->tb_lineno != last_line || + last_line == -1 || tb_lineno != last_line || last_name == NULL || code->co_name != last_name) { if (cnt > TB_RECURSIVE_CUTOFF) { if (tb_print_line_repeated(f, cnt) < 0) { @@ -1000,13 +1023,13 @@ tb_printinternal(PyTracebackObject *tb, PyObject *f, long limit, } } last_file = code->co_filename; - last_line = tb->tb_lineno; + last_line = tb_lineno; last_name = code->co_name; cnt = 0; } cnt++; if (cnt <= TB_RECURSIVE_CUTOFF) { - if (tb_displayline(tb, f, code->co_filename, 
tb->tb_lineno, + if (tb_displayline(tb, f, code->co_filename, tb_lineno, tb->tb_frame, code->co_name, indent, margin) < 0) { goto error; } From be21fb6369d0c1ab3f7642e734fa46594dcbbf8c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 17:24:17 +0100 Subject: [PATCH 158/265] [3.11] gh-106861: Docs: Add availability directives to all Unix-only modules (GH-108975) (#111554) Co-authored-by: xzmeng --- Doc/library/fcntl.rst | 2 +- Doc/library/grp.rst | 2 +- Doc/library/posix.rst | 2 ++ Doc/library/pty.rst | 2 ++ Doc/library/pwd.rst | 2 +- Doc/library/resource.rst | 2 +- Doc/library/syslog.rst | 4 ++-- Doc/library/termios.rst | 2 ++ Doc/library/tty.rst | 2 ++ 9 files changed, 14 insertions(+), 6 deletions(-) diff --git a/Doc/library/fcntl.rst b/Doc/library/fcntl.rst index 1951f80f91e663..c417a84e048f8a 100644 --- a/Doc/library/fcntl.rst +++ b/Doc/library/fcntl.rst @@ -18,7 +18,7 @@ interface to the :c:func:`fcntl` and :c:func:`ioctl` Unix routines. For a complete description of these calls, see :manpage:`fcntl(2)` and :manpage:`ioctl(2)` Unix manual pages. -.. include:: ../includes/wasm-notavail.rst +.. availability:: Unix, not Emscripten, not WASI. All functions in this module take a file descriptor *fd* as their first argument. This can be an integer file descriptor, such as returned by diff --git a/Doc/library/grp.rst b/Doc/library/grp.rst index 14af744e3ae8f4..ee55b12ea8643a 100644 --- a/Doc/library/grp.rst +++ b/Doc/library/grp.rst @@ -10,7 +10,7 @@ This module provides access to the Unix group database. It is available on all Unix versions. -.. include:: ../includes/wasm-notavail.rst +.. availability:: Unix, not Emscripten, not WASI. Group database entries are reported as a tuple-like object, whose attributes correspond to the members of the ``group`` structure (Attribute field below, see diff --git a/Doc/library/posix.rst b/Doc/library/posix.rst index 0413f9d02a8d57..5871574b442667 100644 --- a/Doc/library/posix.rst +++ b/Doc/library/posix.rst @@ -11,6 +11,8 @@ This module provides access to operating system functionality that is standardized by the C Standard and the POSIX standard (a thinly disguised Unix interface). +.. availability:: Unix. + .. index:: pair: module; os **Do not import this module directly.** Instead, import the module :mod:`os`, diff --git a/Doc/library/pty.rst b/Doc/library/pty.rst index ad4981c97119fa..af9378464edb9f 100644 --- a/Doc/library/pty.rst +++ b/Doc/library/pty.rst @@ -16,6 +16,8 @@ The :mod:`pty` module defines operations for handling the pseudo-terminal concept: starting another process and being able to write to and read from its controlling terminal programmatically. +.. availability:: Unix. + Pseudo-terminal handling is highly platform dependent. This code is mainly tested on Linux, FreeBSD, and macOS (it is supposed to work on other POSIX platforms but it's not been thoroughly tested). diff --git a/Doc/library/pwd.rst b/Doc/library/pwd.rst index 7cafc66fd7e93c..755f0d29ac7345 100644 --- a/Doc/library/pwd.rst +++ b/Doc/library/pwd.rst @@ -10,7 +10,7 @@ This module provides access to the Unix user account and password database. It is available on all Unix versions. -.. include:: ../includes/wasm-notavail.rst +.. availability:: Unix, not Emscripten, not WASI. 
Password database entries are reported as a tuple-like object, whose attributes correspond to the members of the ``passwd`` structure (Attribute field below, diff --git a/Doc/library/resource.rst b/Doc/library/resource.rst index a5324c82c63484..ef65674d1b0a78 100644 --- a/Doc/library/resource.rst +++ b/Doc/library/resource.rst @@ -13,7 +13,7 @@ This module provides basic mechanisms for measuring and controlling system resources utilized by a program. -.. include:: ../includes/wasm-notavail.rst +.. availability:: Unix, not Emscripten, not WASI. Symbolic constants are used to specify particular system resources and to request usage information about either the current process or its children. diff --git a/Doc/library/syslog.rst b/Doc/library/syslog.rst index 766ff57cc66d69..889bbb39d58232 100644 --- a/Doc/library/syslog.rst +++ b/Doc/library/syslog.rst @@ -11,12 +11,12 @@ This module provides an interface to the Unix ``syslog`` library routines. Refer to the Unix manual pages for a detailed description of the ``syslog`` facility. +.. availability:: Unix, not Emscripten, not WASI. + This module wraps the system ``syslog`` family of routines. A pure Python library that can speak to a syslog server is available in the :mod:`logging.handlers` module as :class:`SysLogHandler`. -.. include:: ../includes/wasm-notavail.rst - The module defines the following functions: diff --git a/Doc/library/termios.rst b/Doc/library/termios.rst index 03806178e9d3fb..57705ddc4e6470 100644 --- a/Doc/library/termios.rst +++ b/Doc/library/termios.rst @@ -16,6 +16,8 @@ complete description of these calls, see :manpage:`termios(3)` Unix manual page. It is only available for those Unix versions that support POSIX *termios* style tty I/O control configured during installation. +.. availability:: Unix. + All functions in this module take a file descriptor *fd* as their first argument. This can be an integer file descriptor, such as returned by ``sys.stdin.fileno()``, or a :term:`file object`, such as ``sys.stdin`` itself. diff --git a/Doc/library/tty.rst b/Doc/library/tty.rst index b30bc3c7ac42e9..75ba6c4523e5e7 100644 --- a/Doc/library/tty.rst +++ b/Doc/library/tty.rst @@ -15,6 +15,8 @@ The :mod:`tty` module defines functions for putting the tty into cbreak and raw modes. +.. availability:: Unix. + Because it requires the :mod:`termios` module, it will work only on Unix. The :mod:`tty` module defines the following functions: From 650eb29b4941efae6405478763b605c5852cdc61 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 17:24:41 +0100 Subject: [PATCH 159/265] [3.11] gh-93607: document `root` attribute of `iterparse` (GH-99410) (#111556) Co-authored-by: Prometheus3375 <35541026+Prometheus3375@users.noreply.github.com> Co-authored-by: Stanley <46876382+slateny@users.noreply.github.com> Co-authored-by: Hugo van Kemenade --- Doc/library/xml.etree.elementtree.rst | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/Doc/library/xml.etree.elementtree.rst b/Doc/library/xml.etree.elementtree.rst index 7fd4c7a4a5da2b..31135a7e613c97 100644 --- a/Doc/library/xml.etree.elementtree.rst +++ b/Doc/library/xml.etree.elementtree.rst @@ -622,7 +622,9 @@ Functions *parser* is an optional parser instance. If not given, the standard :class:`XMLParser` parser is used. *parser* must be a subclass of :class:`XMLParser` and can only use the default :class:`TreeBuilder` as a - target. Returns an :term:`iterator` providing ``(event, elem)`` pairs. + target. 
Returns an :term:`iterator` providing ``(event, elem)`` pairs; + it has a ``root`` attribute that references the root element of the + resulting XML tree once *source* is fully read. Note that while :func:`iterparse` builds the tree incrementally, it issues blocking reads on *source* (or the file it names). As such, it's unsuitable From 2bb10acfdd9782e36534a1cb95bec520d0338ab6 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 31 Oct 2023 17:40:17 +0100 Subject: [PATCH 160/265] [3.11] gh-102249: Expand sys.call_tracing documentation (GH-102806) (#111558) Co-authored-by: Quentin Peter Co-authored-by: C.A.M. Gerlach --- Doc/library/sys.rst | 20 +++++++++++++++++--- 1 file changed, 17 insertions(+), 3 deletions(-) diff --git a/Doc/library/sys.rst b/Doc/library/sys.rst index 064a9c47137b30..f5b94f567ff749 100644 --- a/Doc/library/sys.rst +++ b/Doc/library/sys.rst @@ -173,7 +173,11 @@ always available. Call ``func(*args)``, while tracing is enabled. The tracing state is saved, and restored afterwards. This is intended to be called from a debugger from - a checkpoint, to recursively debug some other code. + a checkpoint, to recursively debug or profile some other code. + + Tracing is suspended while calling a tracing function set by + :func:`settrace` or :func:`setprofile` to avoid infinite recursion. + :func:`!call_tracing` enables explicit recursion of the tracing function. .. data:: copyright @@ -1439,13 +1443,16 @@ always available. its return value is not used, so it can simply return ``None``. Error in the profile function will cause itself unset. + .. note:: + The same tracing mechanism is used for :func:`!setprofile` as :func:`settrace`. + To trace calls with :func:`!setprofile` inside a tracing function + (e.g. in a debugger breakpoint), see :func:`call_tracing`. + Profile functions should have three arguments: *frame*, *event*, and *arg*. *frame* is the current stack frame. *event* is a string: ``'call'``, ``'return'``, ``'c_call'``, ``'c_return'``, or ``'c_exception'``. *arg* depends on the event type. - .. audit-event:: sys.setprofile "" sys.setprofile - The events have the following meaning: ``'call'`` @@ -1467,6 +1474,9 @@ always available. ``'c_exception'`` A C function has raised an exception. *arg* is the C function object. + .. audit-event:: sys.setprofile "" sys.setprofile + + .. function:: setrecursionlimit(limit) Set the maximum depth of the Python interpreter stack to *limit*. This limit @@ -1526,6 +1536,10 @@ always available. If there is any error occurred in the trace function, it will be unset, just like ``settrace(None)`` is called. + .. note:: + Tracing is disabled while calling the trace function (e.g. a function set by + :func:`!settrace`). For recursive tracing see :func:`call_tracing`. + The events have the following meaning: ``'call'`` From 1d7ad7780a9365bddfa6ba9dbba2e390ca55fba7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 1 Nov 2023 04:57:23 +0100 Subject: [PATCH 161/265] [3.11] gh-110367: Make regrtest --verbose3 compatible with --huntrleaks -jN (GH-111577) (#111590) gh-110367: Make regrtest --verbose3 compatible with --huntrleaks -jN (GH-111577) "./python -m test -j1 -R 3:3 --verbose3" now works as expected, since run_single_test() does not replace sys.stdout with StringIO in this case. 
(cherry picked from commit d9a5530d2327efa1fe66a04d31b5c67e42dbcd9c) Co-authored-by: Victor Stinner --- Lib/test/libregrtest/cmdline.py | 10 ++++++-- Lib/test/test_regrtest.py | 23 +++++++++++++++++++ ...-10-31-22-09-25.gh-issue-110367.UhQi44.rst | 3 +++ 3 files changed, 34 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst diff --git a/Lib/test/libregrtest/cmdline.py b/Lib/test/libregrtest/cmdline.py index 905b3f862281f1..9ba6ec63cad9f8 100644 --- a/Lib/test/libregrtest/cmdline.py +++ b/Lib/test/libregrtest/cmdline.py @@ -494,10 +494,16 @@ def _parse_args(args, **kwargs): ns.randomize = True if ns.verbose: ns.header = True - if ns.huntrleaks and ns.verbose3: + # When -jN option is used, a worker process does not use --verbose3 + # and so -R 3:3 -jN --verbose3 just works as expected: there is no false + # alarm about memory leak. + if ns.huntrleaks and ns.verbose3 and ns.use_mp is None: ns.verbose3 = False + # run_single_test() replaces sys.stdout with io.StringIO if verbose3 + # is true. In this case, huntrleaks sees an write into StringIO as + # a memory leak, whereas it is not (gh-71290). print("WARNING: Disable --verbose3 because it's incompatible with " - "--huntrleaks: see http://bugs.python.org/issue27103", + "--huntrleaks without -jN option", file=sys.stderr) if ns.forever: # --forever implies --failfast diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index aa3f65487a288e..2f1bb03bc0ba4e 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -2120,6 +2120,29 @@ def test_crash(self): self.assertIn(f"Exit code {exitcode} (SIGSEGV)", output) self.check_line(output, "just before crash!", full=True, regex=False) + def test_verbose3(self): + code = textwrap.dedent(r""" + import unittest + from test import support + + class VerboseTests(unittest.TestCase): + def test_pass(self): + print("SPAM SPAM SPAM") + """) + testname = self.create_test(code=code) + + # Run sequentially + output = self.run_tests("--verbose3", testname) + self.check_executed_tests(output, testname, stats=1) + self.assertNotIn('SPAM SPAM SPAM', output) + + # -R option needs a debug build + if support.Py_DEBUG: + # Check for reference leaks, run in parallel + output = self.run_tests("-R", "3:3", "-j1", "--verbose3", testname) + self.check_executed_tests(output, testname, stats=1, parallel=True) + self.assertNotIn('SPAM SPAM SPAM', output) + class TestUtils(unittest.TestCase): def test_format_duration(self): diff --git a/Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst b/Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst new file mode 100644 index 00000000000000..0254288d3626cc --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst @@ -0,0 +1,3 @@ +Make regrtest ``--verbose3`` option compatible with ``--huntrleaks -jN`` +options. The ``./python -m test -j1 -R 3:3 --verbose3`` command now works as +expected. Patch by Victor Stinner. 
From cf6145453ff3a7bdc0790ac6d95b5b097c489d45 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 1 Nov 2023 11:43:37 +0100 Subject: [PATCH 162/265] [3.11] gh-111576: Improve documention for tkinter.messagebox (GH-111578) (GH-111598) (cherry picked from commit eaf67e37a2da28c1241362e3b4ff1202945c83c5) Co-authored-by: Serhiy Storchaka --- Doc/library/tkinter.messagebox.rst | 175 +++++++++++++++++++++++++++-- 1 file changed, 165 insertions(+), 10 deletions(-) diff --git a/Doc/library/tkinter.messagebox.rst b/Doc/library/tkinter.messagebox.rst index 56c1d6c132afd2..56090a0a0e424b 100644 --- a/Doc/library/tkinter.messagebox.rst +++ b/Doc/library/tkinter.messagebox.rst @@ -11,7 +11,8 @@ The :mod:`tkinter.messagebox` module provides a template base class as well as a variety of convenience methods for commonly used configurations. The message -boxes are modal and will return a subset of (True, False, OK, None, Yes, No) based on +boxes are modal and will return a subset of (``True``, ``False``, ``None``, +:data:`OK`, :data:`CANCEL`, :data:`YES`, :data:`NO`) based on the user's selection. Common message box styles and layouts include but are not limited to: @@ -19,21 +20,175 @@ limited to: .. class:: Message(master=None, **options) - Create a default information message box. + Create a message window with an application-specified message, an icon + and a set of buttons. + Each of the buttons in the message window is identified by a unique symbolic name (see the *type* options). + + The following options are supported: + + *command* + Specifies the function to invoke when the user closes the dialog. + The name of the button clicked by the user to close the dialog is + passed as argument. + This is only available on macOS. + + *default* + Gives the :ref:`symbolic name ` of the default button + for this message window (:data:`OK`, :data:`CANCEL`, and so on). + If this option is not specified, the first button in the dialog will + be made the default. + + *detail* + Specifies an auxiliary message to the main message given by the + *message* option. + The message detail will be presented beneath the main message and, + where supported by the OS, in a less emphasized font than the main + message. + + *icon* + Specifies an :ref:`icon ` to display. + If this option is not specified, then the :data:`INFO` icon will be + displayed. + + *message* + Specifies the message to display in this message box. + The default value is an empty string. + + *parent* + Makes the specified window the logical parent of the message box. + The message box is displayed on top of its parent window. + + *title* + Specifies a string to display as the title of the message box. + This option is ignored on macOS, where platform guidelines forbid + the use of a title on this kind of dialog. + + *type* + Arranges for a :ref:`predefined set of buttons ` + to be displayed. + + .. method:: show(**options) + + Display a message window and wait for the user to select one of the buttons. Then return the symbolic name of the selected button. + Keyword arguments can override options specified in the constructor. + **Information message box** -.. method:: showinfo(title=None, message=None, **options) +.. function:: showinfo(title=None, message=None, **options) + + Creates and displays an information message box with the specified title + and message. **Warning message boxes** -.. 
method:: showwarning(title=None, message=None, **options) - showerror(title=None, message=None, **options) +.. function:: showwarning(title=None, message=None, **options) + + Creates and displays a warning message box with the specified title + and message. + +.. function:: showerror(title=None, message=None, **options) + + Creates and displays an error message box with the specified title + and message. **Question message boxes** -.. method:: askquestion(title=None, message=None, **options) - askokcancel(title=None, message=None, **options) - askretrycancel(title=None, message=None, **options) - askyesno(title=None, message=None, **options) - askyesnocancel(title=None, message=None, **options) +.. function:: askquestion(title=None, message=None, *, type=YESNO, **options) + + Ask a question. By default shows buttons :data:`YES` and :data:`NO`. + Returns the symbolic name of the selected button. + +.. function:: askokcancel(title=None, message=None, **options) + + Ask if operation should proceed. Shows buttons :data:`OK` and :data:`CANCEL`. + Returns ``True`` if the answer is ok and ``False`` otherwise. + +.. function:: askretrycancel(title=None, message=None, **options) + + Ask if operation should be retried. Shows buttons :data:`RETRY` and :data:`CANCEL`. + Return ``True`` if the answer is yes and ``False`` otherwise. + +.. function:: askyesno(title=None, message=None, **options) + + Ask a question. Shows buttons :data:`YES` and :data:`NO`. + Returns ``True`` if the answer is yes and ``False`` otherwise. + +.. function:: askyesnocancel(title=None, message=None, **options) + + Ask a question. Shows buttons :data:`YES`, :data:`NO` and :data:`CANCEL`. + Return ``True`` if the answer is yes, ``None`` if cancelled, and ``False`` + otherwise. + + +.. _messagebox-buttons: + +Symbolic names of buttons: + +.. data:: ABORT + :value: 'abort' +.. data:: RETRY + :value: 'retry' +.. data:: IGNORE + :value: 'ignore' +.. data:: OK + :value: 'ok' +.. data:: CANCEL + :value: 'cancel' +.. data:: YES + :value: 'yes' +.. data:: NO + :value: 'no' + +.. _messagebox-types: + +Predefined sets of buttons: + +.. data:: ABORTRETRYIGNORE + :value: 'abortretryignore' + + Displays three buttons whose symbolic names are :data:`ABORT`, + :data:`RETRY` and :data:`IGNORE`. + +.. data:: OK + :value: 'ok' + :noindex: + + Displays one button whose symbolic name is :data:`OK`. + +.. data:: OKCANCEL + :value: 'okcancel' + + Displays two buttons whose symbolic names are :data:`OK` and + :data:`CANCEL`. + +.. data:: RETRYCANCEL + :value: 'retrycancel' + + Displays two buttons whose symbolic names are :data:`RETRY` and + :data:`CANCEL`. + +.. data:: YESNO + :value: 'yesno' + + Displays two buttons whose symbolic names are :data:`YES` and + :data:`NO`. + +.. data:: YESNOCANCEL + :value: 'yesnocancel' + + Displays three buttons whose symbolic names are :data:`YES`, + :data:`NO` and :data:`CANCEL`. + +.. _messagebox-icons: + +Icon images: + +.. data:: ERROR + :value: 'error' +.. data:: INFO + :value: 'info' +.. data:: QUESTION + :value: 'question' +.. 
data:: WARNING + :value: 'warning' From cfdcc96aba90fca40acea63c3aac4dd4be8c8c1e Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 2 Nov 2023 12:36:35 +0100 Subject: [PATCH 163/265] [3.11] gh-111625: Fix link to Info-ZIP homepage (GH-111626) (#111640) Co-authored-by: partev --- Doc/tools/templates/download.html | 4 ++-- Lib/test/ziptestdata/README.md | 4 ++-- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/Doc/tools/templates/download.html b/Doc/tools/templates/download.html index 7920e0619f9337..b5353d6fb77ab4 100644 --- a/Doc/tools/templates/download.html +++ b/Doc/tools/templates/download.html @@ -49,12 +49,12 @@

Unpacking

Unix users should download the .tar.bz2 archives; these are bzipped tar archives and can be handled in the usual way using tar and the bzip2 -program. The InfoZIP unzip program can be +program. The Info-ZIP unzip program can be used to handle the ZIP archives if desired. The .tar.bz2 archives provide the best compression and fastest download times.

Windows users can use the ZIP archives since those are customary on that -platform. These are created on Unix using the InfoZIP zip program.

+platform. These are created on Unix using the Info-ZIP zip program.

Problems

diff --git a/Lib/test/ziptestdata/README.md b/Lib/test/ziptestdata/README.md index 6b9147db76e178..00d96d445bf543 100644 --- a/Lib/test/ziptestdata/README.md +++ b/Lib/test/ziptestdata/README.md @@ -1,7 +1,7 @@ # Test data for `test_zipfile` The test executables in this directory are created manually from header.sh and -the `testdata_module_inside_zip.py` file. You must have infozip's zip utility +the `testdata_module_inside_zip.py` file. You must have Info-ZIP's zip utility installed (`apt install zip` on Debian). ## Purpose @@ -25,7 +25,7 @@ rm zip2.zip ### Modern format (4.5) zip64 file -Redirecting from stdin forces infozip's zip tool to create a zip64. +Redirecting from stdin forces Info-ZIP's zip tool to create a zip64. ``` zip -0 zip64.zip From e61f2edfe8ebfea4de5f0b55f3b790fdf07998d7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 3 Nov 2023 00:03:02 +0100 Subject: [PATCH 164/265] [3.11] Fix typo in documentation of `SysLogHandler.createSocket` (GH-111665) (#111670) (cherry picked from commit 489b80640ff9c4f10b25da6d562b06c62a10a76b) --- Doc/library/logging.handlers.rst | 4 +--- 1 file changed, 1 insertion(+), 3 deletions(-) diff --git a/Doc/library/logging.handlers.rst b/Doc/library/logging.handlers.rst index f9b3b9b9abe656..bb30f716dd3d39 100644 --- a/Doc/library/logging.handlers.rst +++ b/Doc/library/logging.handlers.rst @@ -656,9 +656,7 @@ supports sending logging messages to a remote or local Unix syslog. to the other end. This method is called during handler initialization, but it's not regarded as an error if the other end isn't listening at this point - the method will be called again when emitting an event, if - but it's not regarded as an error if the other end isn't listening yet - --- the method will be called again when emitting an event, - if there is no socket at that point. + there is no socket at that point. .. versionadded:: 3.11 From a106f61b728b61269ffff5d0909be044e7c8c41e Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 3 Nov 2023 07:28:53 +0100 Subject: [PATCH 165/265] [3.11] gh-54434: Make difflib.rst doctests pass. (GH-111677) (#111679) gh-54434: Make difflib.rst doctests pass. (GH-111677) (cherry picked from commit 0d3df272fbd131bff7f02d4d4279ad1e35081121) Co-authored-by: Terry Jan Reedy --- Doc/library/difflib.rst | 8 +++++--- 1 file changed, 5 insertions(+), 3 deletions(-) diff --git a/Doc/library/difflib.rst b/Doc/library/difflib.rst index 431ee9c2c6ce17..83c669e50f1f85 100644 --- a/Doc/library/difflib.rst +++ b/Doc/library/difflib.rst @@ -173,9 +173,12 @@ diffs. For comparing directories and files, see also, the :mod:`filecmp` module. expressed in the ISO 8601 format. If not specified, the strings default to blanks. + >>> import sys + >>> from difflib import * >>> s1 = ['bacon\n', 'eggs\n', 'ham\n', 'guido\n'] >>> s2 = ['python\n', 'eggy\n', 'hamster\n', 'guido\n'] - >>> sys.stdout.writelines(context_diff(s1, s2, fromfile='before.py', tofile='after.py')) + >>> sys.stdout.writelines(context_diff(s1, s2, fromfile='before.py', + ... tofile='after.py')) *** before.py --- after.py *************** @@ -298,13 +301,12 @@ diffs. For comparing directories and files, see also, the :mod:`filecmp` module. For inputs that do not have trailing newlines, set the *lineterm* argument to ``""`` so that the output will be uniformly newline free. 
- The context diff format normally has a header for filenames and modification + The unified diff format normally has a header for filenames and modification times. Any or all of these may be specified using strings for *fromfile*, *tofile*, *fromfiledate*, and *tofiledate*. The modification times are normally expressed in the ISO 8601 format. If not specified, the strings default to blanks. - >>> s1 = ['bacon\n', 'eggs\n', 'ham\n', 'guido\n'] >>> s2 = ['python\n', 'eggy\n', 'hamster\n', 'guido\n'] >>> sys.stdout.writelines(unified_diff(s1, s2, fromfile='before.py', tofile='after.py')) From 4a6116967228f8399d4e5bc4dfed6f6da81202e3 Mon Sep 17 00:00:00 2001 From: Ethan Furman Date: Fri, 3 Nov 2023 16:51:56 -0700 Subject: [PATCH 166/265] [3.11] gh-111181: Fix enum doctests (GH-111180) (GH-111617) gh-111181: Fix enum doctests (GH-111180) Co-authored-by: Ethan Furman (cherry picked from commit c4dc5a6ae8aa13abb743182df088f1a3526d1bcd) Co-authored-by: Nikita Sobolev --- Doc/howto/enum.rst | 8 +++++--- Lib/enum.py | 11 +++++------ Lib/test/test_enum.py | 17 +++++++++++------ 3 files changed, 21 insertions(+), 15 deletions(-) diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index 58849429539cca..a714f6d9aa3c7e 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -494,7 +494,8 @@ It is possible to modify how enum members are pickled/unpickled by defining :meth:`__reduce_ex__` in the enumeration class. The default method is by-value, but enums with complicated values may want to use by-name:: - >>> class MyEnum(Enum): + >>> import enum + >>> class MyEnum(enum.Enum): ... __reduce_ex__ = enum.pickle_by_enum_name .. note:: @@ -736,7 +737,7 @@ be combined with them (but may lose :class:`IntFlag` membership:: >>> Perm.X | 4 - >>> Perm.X | 8 + >>> Perm.X + 8 9 .. note:: @@ -1398,8 +1399,9 @@ alias:: ... GRENE = 2 ... Traceback (most recent call last): - ... + ... ValueError: aliases not allowed in DuplicateFreeEnum: 'GRENE' --> 'GREEN' + Error calling __set_name__ on '_proto_member' instance 'GRENE' in 'Color' .. note:: diff --git a/Lib/enum.py b/Lib/enum.py index 155cb13022fa16..63833aaa717feb 100644 --- a/Lib/enum.py +++ b/Lib/enum.py @@ -1197,14 +1197,13 @@ def __str__(self): def __dir__(self): """ - Returns all members and all public methods + Returns public methods and other interesting attributes. 
""" - if self.__class__._member_type_ is object: - interesting = set(['__class__', '__doc__', '__eq__', '__hash__', '__module__', 'name', 'value']) - else: + interesting = set() + if self.__class__._member_type_ is not object: interesting = set(object.__dir__(self)) for name in getattr(self, '__dict__', []): - if name[0] != '_': + if name[0] != '_' and name not in self._member_map_: interesting.add(name) for cls in self.__class__.mro(): for name, obj in cls.__dict__.items(): @@ -1217,7 +1216,7 @@ def __dir__(self): else: # in case it was added by `dir(self)` interesting.discard(name) - else: + elif name not in self._member_map_: interesting.add(name) names = sorted( set(['__class__', '__doc__', '__eq__', '__hash__', '__module__']) diff --git a/Lib/test/test_enum.py b/Lib/test/test_enum.py index ed1c3a59ce479c..ab2b45853c2214 100644 --- a/Lib/test/test_enum.py +++ b/Lib/test/test_enum.py @@ -18,7 +18,7 @@ from io import StringIO from pickle import dumps, loads, PicklingError, HIGHEST_PROTOCOL from test import support -from test.support import ALWAYS_EQ +from test.support import ALWAYS_EQ, REPO_ROOT from test.support import threading_helper from textwrap import dedent from datetime import timedelta @@ -27,14 +27,19 @@ def load_tests(loader, tests, ignore): tests.addTests(doctest.DocTestSuite(enum)) - if os.path.exists('Doc/library/enum.rst'): + + lib_tests = os.path.join(REPO_ROOT, 'Doc/library/enum.rst') + if os.path.exists(lib_tests): tests.addTests(doctest.DocFileSuite( - '../../Doc/library/enum.rst', + lib_tests, + module_relative=False, optionflags=doctest.ELLIPSIS|doctest.NORMALIZE_WHITESPACE, )) - if os.path.exists('Doc/howto/enum.rst'): + howto_tests = os.path.join(REPO_ROOT, 'Doc/howto/enum.rst') + if os.path.exists(howto_tests): tests.addTests(doctest.DocFileSuite( - '../../Doc/howto/enum.rst', + howto_tests, + module_relative=False, optionflags=doctest.ELLIPSIS|doctest.NORMALIZE_WHITESPACE, )) return tests @@ -4874,7 +4879,7 @@ def member_dir(member): allowed.add(name) else: allowed.discard(name) - else: + elif name not in member._member_map_: allowed.add(name) return sorted(allowed) From 89264a31eaeb71a1349edba96c090bcf026b498c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 4 Nov 2023 01:43:53 +0100 Subject: [PATCH 167/265] [3.11] gh-111644: Fix support threading_cleanup() (GH-111714) (#111717) gh-111644: Fix support threading_cleanup() (GH-111714) Copy the list of dangling threads to make sure that the list of "Dangling thread" is complete. Previously, the list was incomplete if threads completed just before the list was displayed. Changes: * Rewrite the warning to make it easier to understand. * Use support.sleeping_retry(). * threading_cleanup() no longer copies threading._dangling, but only counts the number of dangling thread. * Remove support.gc_support() call. 
(cherry picked from commit f62c7ccf9abf6e0493978da9cf9ca43adcd403f9) Co-authored-by: Victor Stinner --- Lib/test/support/threading_helper.py | 53 +++++++++++++++------------- 1 file changed, 28 insertions(+), 25 deletions(-) diff --git a/Lib/test/support/threading_helper.py b/Lib/test/support/threading_helper.py index b9973c8bf5c914..6815ab36ec7a9f 100644 --- a/Lib/test/support/threading_helper.py +++ b/Lib/test/support/threading_helper.py @@ -22,34 +22,37 @@ def threading_setup(): - return _thread._count(), threading._dangling.copy() + return _thread._count(), len(threading._dangling) def threading_cleanup(*original_values): - _MAX_COUNT = 100 - - for count in range(_MAX_COUNT): - values = _thread._count(), threading._dangling - if values == original_values: - break - - if not count: - # Display a warning at the first iteration - support.environment_altered = True - dangling_threads = values[1] - support.print_warning(f"threading_cleanup() failed to cleanup " - f"{values[0] - original_values[0]} threads " - f"(count: {values[0]}, " - f"dangling: {len(dangling_threads)})") - for thread in dangling_threads: - support.print_warning(f"Dangling thread: {thread!r}") - - # Don't hold references to threads - dangling_threads = None - values = None - - time.sleep(0.01) - support.gc_collect() + orig_count, orig_ndangling = original_values + + timeout = 1.0 + for _ in support.sleeping_retry(timeout, error=False): + # Copy the thread list to get a consistent output. threading._dangling + # is a WeakSet, its value changes when it's read. + dangling_threads = list(threading._dangling) + count = _thread._count() + + if count <= orig_count: + return + + # Timeout! + support.environment_altered = True + support.print_warning( + f"threading_cleanup() failed to clean up threads " + f"in {timeout:.1f} seconds\n" + f" before: thread count={orig_count}, dangling={orig_ndangling}\n" + f" after: thread count={count}, dangling={len(dangling_threads)}") + for thread in dangling_threads: + support.print_warning(f"Dangling thread: {thread!r}") + + # The warning happens when a test spawns threads and some of these threads + # are still running after the test completes. To fix this warning, join + # threads explicitly to wait until they complete. + # + # To make the warning more likely, reduce the timeout. def reap_threads(func): From f7ffe4a8eac64dedd2950e334f870a5c111e4be7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 4 Nov 2023 02:23:01 +0100 Subject: [PATCH 168/265] [3.11] [3.12] GH-110894: Call loop exception handler for exceptions in client_connected_cb (GH-111601) (GH-111632) (#111634) * [3.12] GH-110894: Call loop exception handler for exceptions in client_connected_cb (GH-111601) (GH-111632) (cherry picked from commit 9aa88290d82e2808eed84e7a63d0bf9623f84f53) Co-authored-by: Kumar Aditya Call loop exception handler for exceptions in `client_connected_cb` of `asyncio.start_server` so that applications can handle it.. (cherry picked from commit 229f44d353c71185414a072017f46f125676bdd6) * gh-111644: Fix asyncio test_unhandled_exceptions() (#111713) Fix test_unhandled_exceptions() of test_asyncio.test_streams: break explicitly a reference cycle. Fix also StreamTests.tearDown(): the loop must not be closed explicitly, but using set_event_loop() which takes care of shutting down the executor with executor.shutdown(wait=True). BaseEventLoop.close() calls executor.shutdown(wait=False). 
(cherry picked from commit ac01e2243a1104b2154c0d1bdbc9f8d5b3ada778) --------- Co-authored-by: Kumar Aditya Co-authored-by: Victor Stinner --- Lib/asyncio/streams.py | 12 +++++++ Lib/test/test_asyncio/test_streams.py | 34 +++++++++++++++++-- ...-11-01-14-03-24.gh-issue-110894.7-wZxC.rst | 1 + 3 files changed, 45 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py index 19d9154b66d027..7c407067e05a16 100644 --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -245,7 +245,19 @@ def connection_made(self, transport): res = self._client_connected_cb(reader, self._stream_writer) if coroutines.iscoroutine(res): + def callback(task): + exc = task.exception() + if exc is not None: + self._loop.call_exception_handler({ + 'message': 'Unhandled exception in client_connected_cb', + 'exception': exc, + 'transport': transport, + }) + transport.close() + self._task = self._loop.create_task(res) + self._task.add_done_callback(callback) + self._strong_reader = None def connection_lost(self, exc): diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py index 95fc7a159baee8..86354306f1fff3 100644 --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -36,8 +36,7 @@ def tearDown(self): # just in case if we have transport close callbacks test_utils.run_briefly(self.loop) - self.loop.close() - gc.collect() + # set_event_loop() takes care of closing self.loop in a safe way super().tearDown() def _basetest_open_connection(self, open_connection_fut): @@ -1068,6 +1067,37 @@ def test_eof_feed_when_closing_writer(self): self.assertEqual(messages, []) + def test_unhandled_exceptions(self) -> None: + port = socket_helper.find_unused_port() + + messages = [] + self.loop.set_exception_handler(lambda loop, ctx: messages.append(ctx)) + + async def client(): + rd, wr = await asyncio.open_connection('localhost', port) + wr.write(b'test msg') + await wr.drain() + wr.close() + await wr.wait_closed() + + async def main(): + async def handle_echo(reader, writer): + raise Exception('test') + + server = await asyncio.start_server( + handle_echo, 'localhost', port) + await server.start_serving() + await client() + server.close() + await server.wait_closed() + + self.loop.run_until_complete(main()) + + self.assertEqual(messages[0]['message'], + 'Unhandled exception in client_connected_cb') + # Break explicitly reference cycle + messages = None + if __name__ == '__main__': unittest.main() diff --git a/Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst b/Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst new file mode 100644 index 00000000000000..c59fe6b9119eca --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst @@ -0,0 +1 @@ +Call loop exception handler for exceptions in ``client_connected_cb`` of :func:`asyncio.start_server` so that applications can handle it. Patch by Kumar Aditya. 
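A minimal sketch of how an application could observe the behaviour this change introduces: the loop's exception handler now receives errors raised inside ``client_connected_cb``. The handler, the failing callback and the port selection below are illustrative choices, not part of the patch::

    import asyncio

    async def main():
        seen = []

        def on_exception(loop, context):
            # With this change the context carries the message
            # "Unhandled exception in client_connected_cb".
            seen.append(context.get('message'))

        async def handle_client(reader, writer):
            raise RuntimeError('boom')

        loop = asyncio.get_running_loop()
        loop.set_exception_handler(on_exception)

        server = await asyncio.start_server(handle_client, '127.0.0.1', 0)
        port = server.sockets[0].getsockname()[1]
        reader, writer = await asyncio.open_connection('127.0.0.1', port)
        writer.close()
        await writer.wait_closed()
        await asyncio.sleep(0.1)   # give the failing server task time to finish

        server.close()
        await server.wait_closed()
        print(seen)

    asyncio.run(main())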
From 425efc11828a8713057e75f04e8d7d889e2f20db Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 4 Nov 2023 11:12:15 +0100 Subject: [PATCH 169/265] [3.11] gh-111159: Fix `SyntaxError` doctests for non-builtin exception classes (GH-111541) (#111733) gh-111159: Fix `SyntaxError` doctests for non-builtin exception classes (GH-111541) (cherry picked from commit 18c954849bcdd5acb6ef91cd90d92f3b5c685134) Co-authored-by: Nikita Sobolev --- Lib/doctest.py | 6 +++++- Lib/test/test_doctest.py | 18 ++++++++++++++++++ ...3-11-04-10-24-25.gh-issue-111541.x0RBI1.rst | 1 + 3 files changed, 24 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst diff --git a/Lib/doctest.py b/Lib/doctest.py index 783664a0d3554e..21abe475ef85f5 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -1376,10 +1376,14 @@ def __run(self, test, compileflags, out): # we don't care about the carets / suggestions / etc # We only care about the error message and notes. # They start with `SyntaxError:` (or any other class name) + exception_line_prefixes = ( + f"{exception[0].__qualname__}:", + f"{exception[0].__module__}.{exception[0].__qualname__}:", + ) exc_msg_index = next( index for index, line in enumerate(formatted_ex) - if line.startswith(f"{exception[0].__name__}:") + if line.startswith(exception_line_prefixes) ) formatted_ex = formatted_ex[exc_msg_index:] diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 98bda46d312909..617529dded3b73 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -3279,6 +3279,24 @@ def test_syntax_error_with_note(cls, multiline=False): raise exc +def test_syntax_error_subclass_from_stdlib(): + """ + `ParseError` is a subclass of `SyntaxError`, but it is not a builtin: + + >>> test_syntax_error_subclass_from_stdlib() + Traceback (most recent call last): + ... + xml.etree.ElementTree.ParseError: error + error + Note + Line + """ + from xml.etree.ElementTree import ParseError + exc = ParseError("error\nerror") + exc.add_note('Note\nLine') + raise exc + + def test_syntax_error_with_incorrect_expected_note(): """ >>> def f(x): diff --git a/Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst b/Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst new file mode 100644 index 00000000000000..719b63dad36fb7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst @@ -0,0 +1 @@ +Fix :mod:`doctest` for :exc:`SyntaxError` not-builtin subclasses. From cebf2d2a5d751052d16b6234e78e0e902b72b706 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 4 Nov 2023 20:56:08 +0100 Subject: [PATCH 170/265] [3.11] gh-111724: Fix doctest `ResourceWarning` in `howto/descriptor.rst` (GH-111725) (#111728) gh-111724: Fix doctest `ResourceWarning` in `howto/descriptor.rst` (GH-111725) Close database connection explicitly in test cleanup. (cherry picked from commit f48e669504ce53040a04e0181064c11741a87817) Co-authored-by: Nikita Sobolev --- Doc/howto/descriptor.rst | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/Doc/howto/descriptor.rst b/Doc/howto/descriptor.rst index 1d9424cb735a46..0a7263614670a1 100644 --- a/Doc/howto/descriptor.rst +++ b/Doc/howto/descriptor.rst @@ -943,6 +943,10 @@ it can be updated: >>> Movie('Star Wars').director 'J.J. Abrams' +.. 
testcleanup:: + + conn.close() + Pure Python Equivalents ^^^^^^^^^^^^^^^^^^^^^^^ From 9fe9eaec2d094a65eb106de447d2f5c9fd1ab267 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 5 Nov 2023 05:28:48 +0100 Subject: [PATCH 171/265] [3.11] gh-111747: DOC: fix moved link to Documentation Translations (GH-111748) (#111750) Update old link in bugs.rst to the table of doc translators and translation repositories at Github. (cherry picked from commit 72e27a67b97993f277e69c9dafb063007ba79adf) Co-authored-by: partev --- Doc/bugs.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/bugs.rst b/Doc/bugs.rst index d98192b369603e..908987cf41ff6e 100644 --- a/Doc/bugs.rst +++ b/Doc/bugs.rst @@ -38,7 +38,7 @@ though it may take a while to be processed. `Helping with Documentation `_ Comprehensive guide for individuals that are interested in contributing to Python documentation. - `Documentation Translations `_ + `Documentation Translations `_ A list of GitHub pages for documentation translation and their primary contacts. From bd4cc4715f425e557b2c707dca21dbf00897cef3 Mon Sep 17 00:00:00 2001 From: Donghee Na Date: Mon, 6 Nov 2023 20:29:59 +0900 Subject: [PATCH 172/265] =?UTF-8?q?[3.11]=20gh-101180:=20Fix=20a=20bug=20w?= =?UTF-8?q?here=20iso2022=5Fjp=5F3=20and=20iso2022=5Fjp=5F2004=20co?= =?UTF-8?q?=E2=80=A6=20(gh-111771)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit [3.11] gh-101180: Fix a bug where iso2022_jp_3 and iso2022_jp_2004 codecs read out of bounds (gh-111695) (cherry picked from commit c8faa3568afd255708096f6aa8df0afa80cf7697) Co-authored-by: Masayuki Moriyama --- Lib/test/test_codecencodings_iso2022.py | 46 +++++++++++++++++++ ...-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst | 1 + Modules/cjkcodecs/_codecs_iso2022.c | 9 ++-- 3 files changed, 53 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst diff --git a/Lib/test/test_codecencodings_iso2022.py b/Lib/test/test_codecencodings_iso2022.py index 00ea1c39dd6fb6..027dbecc6134df 100644 --- a/Lib/test/test_codecencodings_iso2022.py +++ b/Lib/test/test_codecencodings_iso2022.py @@ -24,6 +24,52 @@ class Test_ISO2022_JP2(multibytecodec_support.TestBase, unittest.TestCase): (b'ab\x1BNdef', 'replace', 'abdef'), ) +class Test_ISO2022_JP3(multibytecodec_support.TestBase, unittest.TestCase): + encoding = 'iso2022_jp_3' + tstring = multibytecodec_support.load_teststring('iso2022_jp') + codectests = COMMON_CODEC_TESTS + ( + (b'ab\x1BNdef', 'replace', 'ab\x1BNdef'), + (b'\x1B$(O\x2E\x23\x1B(B', 'strict', '\u3402' ), + (b'\x1B$(O\x2E\x22\x1B(B', 'strict', '\U0002000B' ), + (b'\x1B$(O\x24\x77\x1B(B', 'strict', '\u304B\u309A'), + (b'\x1B$(P\x21\x22\x1B(B', 'strict', '\u4E02' ), + (b'\x1B$(P\x7E\x76\x1B(B', 'strict', '\U0002A6B2' ), + ('\u3402', 'strict', b'\x1B$(O\x2E\x23\x1B(B'), + ('\U0002000B', 'strict', b'\x1B$(O\x2E\x22\x1B(B'), + ('\u304B\u309A', 'strict', b'\x1B$(O\x24\x77\x1B(B'), + ('\u4E02', 'strict', b'\x1B$(P\x21\x22\x1B(B'), + ('\U0002A6B2', 'strict', b'\x1B$(P\x7E\x76\x1B(B'), + (b'ab\x1B$(O\x2E\x21\x1B(Bdef', 'replace', 'ab\uFFFDdef'), + ('ab\u4FF1def', 'replace', b'ab?def'), + ) + xmlcharnametest = ( + '\xAB\u211C\xBB = \u2329\u1234\u232A', + b'\x1B$(O\x29\x28\x1B(Bℜ\x1B$(O\x29\x32\x1B(B = ⟨ሴ⟩' + ) + +class Test_ISO2022_JP2004(multibytecodec_support.TestBase, unittest.TestCase): + encoding = 'iso2022_jp_2004' + tstring = 
multibytecodec_support.load_teststring('iso2022_jp') + codectests = COMMON_CODEC_TESTS + ( + (b'ab\x1BNdef', 'replace', 'ab\x1BNdef'), + (b'\x1B$(Q\x2E\x23\x1B(B', 'strict', '\u3402' ), + (b'\x1B$(Q\x2E\x22\x1B(B', 'strict', '\U0002000B' ), + (b'\x1B$(Q\x24\x77\x1B(B', 'strict', '\u304B\u309A'), + (b'\x1B$(P\x21\x22\x1B(B', 'strict', '\u4E02' ), + (b'\x1B$(P\x7E\x76\x1B(B', 'strict', '\U0002A6B2' ), + ('\u3402', 'strict', b'\x1B$(Q\x2E\x23\x1B(B'), + ('\U0002000B', 'strict', b'\x1B$(Q\x2E\x22\x1B(B'), + ('\u304B\u309A', 'strict', b'\x1B$(Q\x24\x77\x1B(B'), + ('\u4E02', 'strict', b'\x1B$(P\x21\x22\x1B(B'), + ('\U0002A6B2', 'strict', b'\x1B$(P\x7E\x76\x1B(B'), + (b'ab\x1B$(Q\x2E\x21\x1B(Bdef', 'replace', 'ab\u4FF1def'), + ('ab\u4FF1def', 'replace', b'ab\x1B$(Q\x2E\x21\x1B(Bdef'), + ) + xmlcharnametest = ( + '\xAB\u211C\xBB = \u2329\u1234\u232A', + b'\x1B$(Q\x29\x28\x1B(Bℜ\x1B$(Q\x29\x32\x1B(B = ⟨ሴ⟩' + ) + class Test_ISO2022_KR(multibytecodec_support.TestBase, unittest.TestCase): encoding = 'iso2022_kr' tstring = multibytecodec_support.load_teststring('iso2022_kr') diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst new file mode 100644 index 00000000000000..268a3d310f2b49 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst @@ -0,0 +1 @@ +Fix a bug where ``iso2022_jp_3`` and ``iso2022_jp_2004`` codecs read out of bounds diff --git a/Modules/cjkcodecs/_codecs_iso2022.c b/Modules/cjkcodecs/_codecs_iso2022.c index 7394cf67e0e7dd..6d906ecdd396c2 100644 --- a/Modules/cjkcodecs/_codecs_iso2022.c +++ b/Modules/cjkcodecs/_codecs_iso2022.c @@ -181,8 +181,9 @@ ENCODER(iso2022) encoded = MAP_UNMAPPABLE; for (dsg = CONFIG_DESIGNATIONS; dsg->mark; dsg++) { + Py_UCS4 buf[2] = {c, 0}; Py_ssize_t length = 1; - encoded = dsg->encoder(&c, &length); + encoded = dsg->encoder(buf, &length); if (encoded == MAP_MULTIPLE_AVAIL) { /* this implementation won't work for pair * of non-bmp characters. */ @@ -191,9 +192,11 @@ ENCODER(iso2022) return MBERR_TOOFEW; length = -1; } - else + else { + buf[1] = INCHAR2; length = 2; - encoded = dsg->encoder(&c, &length); + } + encoded = dsg->encoder(buf, &length); if (encoded != MAP_UNMAPPABLE) { insize = length; break; From 4edf25565f20550024f98bd2a561d3ee15cc3aac Mon Sep 17 00:00:00 2001 From: Ethan Furman Date: Mon, 6 Nov 2023 18:57:10 -0800 Subject: [PATCH 173/265] [3.11] gh-111797: fix enum how-to (GH-111805) remove extra error line in how-to --- Doc/howto/enum.rst | 1 - 1 file changed, 1 deletion(-) diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index a714f6d9aa3c7e..3deb071144633b 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -1401,7 +1401,6 @@ alias:: Traceback (most recent call last): ... ValueError: aliases not allowed in DuplicateFreeEnum: 'GRENE' --> 'GREEN' - Error calling __set_name__ on '_proto_member' instance 'GRENE' in 'Color' .. 
note:: From 942ef7bde4ad22f1cc73bfc69fe0540395e75405 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 7 Nov 2023 23:04:18 +0100 Subject: [PATCH 174/265] [3.11] gh-111806: Fix `test_recursion` in `test_richcmp` on WASI builds (GH-111830) (GH-111832) gh-111806: Fix `test_recursion` in `test_richcmp` on WASI builds (GH-111830) (cherry picked from commit f115a55f0e455a4b43a1da9fd838a60a101f182a) Co-authored-by: Nikita Sobolev --- Lib/test/test_richcmp.py | 1 + 1 file changed, 1 insertion(+) diff --git a/Lib/test/test_richcmp.py b/Lib/test/test_richcmp.py index 58729a9fea62fa..5f449cdc05c6ba 100644 --- a/Lib/test/test_richcmp.py +++ b/Lib/test/test_richcmp.py @@ -221,6 +221,7 @@ def do(bad): self.assertRaises(Exc, func, Bad()) @support.no_tracing + @support.infinite_recursion(25) def test_recursion(self): # Check that comparison for recursive objects fails gracefully from collections import UserList From 45f17a165bb20e472cc050442d556598b4a0b5d3 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 8 Nov 2023 15:47:54 +0100 Subject: [PATCH 175/265] [3.11] Glossary: Add "static type checker" (GH-111837) (#111855) Glossary: Add "static type checker" (GH-111837) (cherry picked from commit 8ab7ad63086b1793c24b1c5aaa19b60fc0e6540e) Co-authored-by: Jelle Zijlstra Co-authored-by: Alex Waygood --- Doc/glossary.rst | 9 +++++++-- Doc/howto/pyporting.rst | 5 +++-- Doc/library/datetime.rst | 3 ++- Doc/library/typing.rst | 4 ++-- 4 files changed, 14 insertions(+), 7 deletions(-) diff --git a/Doc/glossary.rst b/Doc/glossary.rst index f318a37b654e9d..47a95250f661be 100644 --- a/Doc/glossary.rst +++ b/Doc/glossary.rst @@ -1130,6 +1130,11 @@ Glossary an :term:`expression` or one of several constructs with a keyword, such as :keyword:`if`, :keyword:`while` or :keyword:`for`. + static type checker + An external tool that reads Python code and analyzes it, looking for + issues such as incorrect types. See also :term:`type hints ` + and the :mod:`typing` module. + strong reference In Python's C API, a strong reference is a reference to an object which is owned by the code holding the reference. The strong @@ -1206,8 +1211,8 @@ Glossary attribute, or a function parameter or return value. Type hints are optional and are not enforced by Python but - they are useful to static type analysis tools, and aid IDEs with code - completion and refactoring. + they are useful to :term:`static type checkers `. + They can also aid IDEs with code completion and refactoring. Type hints of global variables, class attributes, and functions, but not local variables, can be accessed using diff --git a/Doc/howto/pyporting.rst b/Doc/howto/pyporting.rst index 6c30a0dd7d6bcc..501b16d82d4d6f 100644 --- a/Doc/howto/pyporting.rst +++ b/Doc/howto/pyporting.rst @@ -39,7 +39,8 @@ are: #. Once your dependencies are no longer blocking you, use continuous integration to make sure you stay compatible with Python 2 and 3 (tox_ can help test against multiple versions of Python; ``python -m pip install tox``) -#. Consider using optional static type checking to make sure your type usage +#. Consider using optional :term:`static type checking ` + to make sure your type usage works in both Python 2 and 3 (e.g. use mypy_ to check your typing under both Python 2 and Python 3; ``python -m pip install mypy``). @@ -395,7 +396,7 @@ comparisons occur, making the mistake much easier to track down. 
Consider using optional static type checking -------------------------------------------- -Another way to help port your code is to use a static type checker like +Another way to help port your code is to use a :term:`static type checker` like mypy_ or pytype_ on your code. These tools can be used to analyze your code as if it's being run under Python 2, then you can run the tool a second time as if your code is running under Python 3. By running a static type checker twice like diff --git a/Doc/library/datetime.rst b/Doc/library/datetime.rst index 1c2abc8cf4e30e..caab35acb94c88 100644 --- a/Doc/library/datetime.rst +++ b/Doc/library/datetime.rst @@ -38,7 +38,8 @@ on efficient attribute extraction for output formatting and manipulation. Third-party library with expanded time zone and parsing support. Package `DateType `_ - Third-party library that introduces distinct static types to e.g. allow static type checkers + Third-party library that introduces distinct static types to e.g. allow + :term:`static type checkers ` to differentiate between naive and aware datetimes. .. _datetime-naive-aware: diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index 0decbaf39749c0..f00a4ce20c8a51 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -18,8 +18,8 @@ .. note:: The Python runtime does not enforce function and variable type annotations. - They can be used by third party tools such as type checkers, IDEs, linters, - etc. + They can be used by third party tools such as :term:`type checkers `, + IDEs, linters, etc. -------------- From a60bbdee3a640620555c4ebe3942c0e0f8af6a26 Mon Sep 17 00:00:00 2001 From: Nikita Sobolev Date: Wed, 8 Nov 2023 20:19:47 +0300 Subject: [PATCH 176/265] [3.11] gh-108303: Move more typing related files to Lib/test/typinganndata (GH-111825) (#111860) --- Lib/test/test_typing.py | 3 +-- Lib/test/{ => typinganndata}/_typed_dict_helper.py | 0 Lib/test/{ => typinganndata}/mod_generics_cache.py | 0 3 files changed, 1 insertion(+), 2 deletions(-) rename Lib/test/{ => typinganndata}/_typed_dict_helper.py (100%) rename Lib/test/{ => typinganndata}/mod_generics_cache.py (100%) diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index fa279b557de2c6..30056e49ae29d9 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -44,8 +44,7 @@ import types from test.support import import_helper, captured_stderr, cpython_only -from test import mod_generics_cache -from test import _typed_dict_helper +from test.typinganndata import mod_generics_cache, _typed_dict_helper py_typing = import_helper.import_fresh_module('typing', blocked=['_typing']) diff --git a/Lib/test/_typed_dict_helper.py b/Lib/test/typinganndata/_typed_dict_helper.py similarity index 100% rename from Lib/test/_typed_dict_helper.py rename to Lib/test/typinganndata/_typed_dict_helper.py diff --git a/Lib/test/mod_generics_cache.py b/Lib/test/typinganndata/mod_generics_cache.py similarity index 100% rename from Lib/test/mod_generics_cache.py rename to Lib/test/typinganndata/mod_generics_cache.py From 409a0ac8e5e6505400df7a686e8aa1b694006cea Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 9 Nov 2023 15:43:28 +0100 Subject: [PATCH 177/265] [3.11] gh-108303: Move config parser data to `Lib/test/configparserdata/` (gh-111879) (gh-111883) gh-108303: Move config parser data to `Lib/test/configparserdata/` (gh-111879) (cherry picked from commit cc18b886a51672c59622837a2b8e83bf6be28c58) Co-authored-by: Nikita Sobolev --- 
Lib/idlelib/idle_test/test_config.py | 8 ++++---- Lib/test/{ => configdata}/cfgparser.1 | 0 Lib/test/{ => configdata}/cfgparser.2 | 0 Lib/test/{ => configdata}/cfgparser.3 | 0 Lib/test/test_configparser.py | 16 ++++++++-------- 5 files changed, 12 insertions(+), 12 deletions(-) rename Lib/test/{ => configdata}/cfgparser.1 (100%) rename Lib/test/{ => configdata}/cfgparser.2 (100%) rename Lib/test/{ => configdata}/cfgparser.3 (100%) diff --git a/Lib/idlelib/idle_test/test_config.py b/Lib/idlelib/idle_test/test_config.py index a746f1538a62b0..6d75cf7aa67dcc 100644 --- a/Lib/idlelib/idle_test/test_config.py +++ b/Lib/idlelib/idle_test/test_config.py @@ -85,8 +85,8 @@ def test_load_nothing(self): self.assertEqual(parser.sections(), []) def test_load_file(self): - # Borrow test/cfgparser.1 from test_configparser. - config_path = findfile('cfgparser.1') + # Borrow test/configdata/cfgparser.1 from test_configparser. + config_path = findfile('cfgparser.1', subdir='configdata') parser = config.IdleConfParser(config_path) parser.Load() @@ -294,8 +294,8 @@ def test_create_config_handlers(self): def test_load_cfg_files(self): conf = self.new_config(_utest=True) - # Borrow test/cfgparser.1 from test_configparser. - config_path = findfile('cfgparser.1') + # Borrow test/configdata/cfgparser.1 from test_configparser. + config_path = findfile('cfgparser.1', subdir='configdata') conf.defaultCfg['foo'] = config.IdleConfParser(config_path) conf.userCfg['foo'] = config.IdleUserConfParser(config_path) diff --git a/Lib/test/cfgparser.1 b/Lib/test/configdata/cfgparser.1 similarity index 100% rename from Lib/test/cfgparser.1 rename to Lib/test/configdata/cfgparser.1 diff --git a/Lib/test/cfgparser.2 b/Lib/test/configdata/cfgparser.2 similarity index 100% rename from Lib/test/cfgparser.2 rename to Lib/test/configdata/cfgparser.2 diff --git a/Lib/test/cfgparser.3 b/Lib/test/configdata/cfgparser.3 similarity index 100% rename from Lib/test/cfgparser.3 rename to Lib/test/configdata/cfgparser.3 diff --git a/Lib/test/test_configparser.py b/Lib/test/test_configparser.py index 59c4b275cb46d7..f08dad06aba38d 100644 --- a/Lib/test/test_configparser.py +++ b/Lib/test/test_configparser.py @@ -544,7 +544,7 @@ def test_parse_errors(self): "[Foo]\n wrong-indent\n") self.assertEqual(e.args, ('',)) # read_file on a real file - tricky = support.findfile("cfgparser.3") + tricky = support.findfile("cfgparser.3", subdir="configdata") if self.delimiters[0] == '=': error = configparser.ParsingError expected = (tricky,) @@ -718,7 +718,7 @@ class mystr(str): def test_read_returns_file_list(self): if self.delimiters[0] != '=': self.skipTest('incompatible format') - file1 = support.findfile("cfgparser.1") + file1 = support.findfile("cfgparser.1", subdir="configdata") # check when we pass a mix of readable and non-readable files: cf = self.newconfig() parsed_files = cf.read([file1, "nonexistent-file"], encoding="utf-8") @@ -751,7 +751,7 @@ def test_read_returns_file_list(self): def test_read_returns_file_list_with_bytestring_path(self): if self.delimiters[0] != '=': self.skipTest('incompatible format') - file1_bytestring = support.findfile("cfgparser.1").encode() + file1_bytestring = support.findfile("cfgparser.1", subdir="configdata").encode() # check when passing an existing bytestring path cf = self.newconfig() parsed_files = cf.read(file1_bytestring, encoding="utf-8") @@ -1161,7 +1161,7 @@ class RawConfigParserTestSambaConf(CfgParserTestCaseClass, unittest.TestCase): empty_lines_in_values = False def test_reading(self): - smbconf = 
support.findfile("cfgparser.2") + smbconf = support.findfile("cfgparser.2", subdir="configdata") # check when we pass a mix of readable and non-readable files: cf = self.newconfig() parsed_files = cf.read([smbconf, "nonexistent-file"], encoding='utf-8') @@ -1356,7 +1356,7 @@ class ConfigParserTestCaseTrickyFile(CfgParserTestCaseClass, unittest.TestCase): allow_no_value = True def test_cfgparser_dot_3(self): - tricky = support.findfile("cfgparser.3") + tricky = support.findfile("cfgparser.3", subdir="configdata") cf = self.newconfig() self.assertEqual(len(cf.read(tricky, encoding='utf-8')), 1) self.assertEqual(cf.sections(), ['strange', @@ -1388,7 +1388,7 @@ def test_cfgparser_dot_3(self): self.assertEqual(cf.get('more interpolation', 'lets'), 'go shopping') def test_unicode_failure(self): - tricky = support.findfile("cfgparser.3") + tricky = support.findfile("cfgparser.3", subdir="configdata") cf = self.newconfig() with self.assertRaises(UnicodeDecodeError): cf.read(tricky, encoding='ascii') @@ -1489,7 +1489,7 @@ def fromstring(self, string, defaults=None): class FakeFile: def __init__(self): - file_path = support.findfile("cfgparser.1") + file_path = support.findfile("cfgparser.1", subdir="configdata") with open(file_path, encoding="utf-8") as f: self.lines = f.readlines() self.lines.reverse() @@ -1510,7 +1510,7 @@ def readline_generator(f): class ReadFileTestCase(unittest.TestCase): def test_file(self): - file_paths = [support.findfile("cfgparser.1")] + file_paths = [support.findfile("cfgparser.1", subdir="configdata")] try: file_paths.append(file_paths[0].encode('utf8')) except UnicodeEncodeError: From 3e7e39c9777c21ea32e3deb29f6e5c96e416bb35 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 9 Nov 2023 16:10:26 +0100 Subject: [PATCH 178/265] [3.11] gh-108303: Move more files to `Lib/test/test_module` (GH-111880) (#111892) gh-108303: Move more files to `Lib/test/test_module` (GH-111880) (cherry picked from commit 0c42f7304a2757fe0f78bc6c6fbb33225cd9da15) Co-authored-by: Nikita Sobolev --- Lib/test/test_module/__init__.py | 2 +- Lib/test/{ => test_module}/final_a.py | 4 ++-- Lib/test/{ => test_module}/final_b.py | 4 ++-- 3 files changed, 5 insertions(+), 5 deletions(-) rename Lib/test/{ => test_module}/final_a.py (79%) rename Lib/test/{ => test_module}/final_b.py (79%) diff --git a/Lib/test/test_module/__init__.py b/Lib/test/test_module/__init__.py index c2bca45bb60c44..b470dfc6c9b2bd 100644 --- a/Lib/test/test_module/__init__.py +++ b/Lib/test/test_module/__init__.py @@ -268,7 +268,7 @@ def test_module_repr_source(self): def test_module_finalization_at_shutdown(self): # Module globals and builtins should still be available during shutdown - rc, out, err = assert_python_ok("-c", "from test import final_a") + rc, out, err = assert_python_ok("-c", "from test.test_module import final_a") self.assertFalse(err) lines = out.splitlines() self.assertEqual(set(lines), { diff --git a/Lib/test/final_a.py b/Lib/test/test_module/final_a.py similarity index 79% rename from Lib/test/final_a.py rename to Lib/test/test_module/final_a.py index 390ee8895a8a9e..a983f3111248e0 100644 --- a/Lib/test/final_a.py +++ b/Lib/test/test_module/final_a.py @@ -3,7 +3,7 @@ """ import shutil -import test.final_b +import test.test_module.final_b x = 'a' @@ -11,7 +11,7 @@ class C: def __del__(self): # Inspect module globals and builtins print("x =", x) - print("final_b.x =", test.final_b.x) + print("final_b.x =", test.test_module.final_b.x) 
print("shutil.rmtree =", getattr(shutil.rmtree, '__name__', None)) print("len =", getattr(len, '__name__', None)) diff --git a/Lib/test/final_b.py b/Lib/test/test_module/final_b.py similarity index 79% rename from Lib/test/final_b.py rename to Lib/test/test_module/final_b.py index 7228d82b880156..f3e8d5594904f2 100644 --- a/Lib/test/final_b.py +++ b/Lib/test/test_module/final_b.py @@ -3,7 +3,7 @@ """ import shutil -import test.final_a +import test.test_module.final_a x = 'b' @@ -11,7 +11,7 @@ class C: def __del__(self): # Inspect module globals and builtins print("x =", x) - print("final_a.x =", test.final_a.x) + print("final_a.x =", test.test_module.final_a.x) print("shutil.rmtree =", getattr(shutil.rmtree, '__name__', None)) print("len =", getattr(len, '__name__', None)) From 63205e5692041c59283b8fa667310d12fd5846ee Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 9 Nov 2023 16:37:04 +0100 Subject: [PATCH 179/265] [3.11] gh-111881: Import doctest lazily in libregrtest (GH-111884) (#111894) gh-111881: Import doctest lazily in libregrtest (GH-111884) In most cases, doctest is not needed. So don't always import it at startup. The change reduces the number of modules already imported when a test is run. (cherry picked from commit 6f09f69b7f85962f66d10637c3325bbb2b2d9853) Co-authored-by: Victor Stinner --- Lib/test/libregrtest/single.py | 13 ++++++++----- 1 file changed, 8 insertions(+), 5 deletions(-) diff --git a/Lib/test/libregrtest/single.py b/Lib/test/libregrtest/single.py index ad75ef54a8c3f8..5c7bc7d40fb394 100644 --- a/Lib/test/libregrtest/single.py +++ b/Lib/test/libregrtest/single.py @@ -1,4 +1,3 @@ -import doctest import faulthandler import gc import importlib @@ -99,14 +98,18 @@ def regrtest_runner(result: TestResult, test_func, runtests: RunTests) -> None: stats = test_result case unittest.TestResult(): stats = TestStats.from_unittest(test_result) - case doctest.TestResults(): - stats = TestStats.from_doctest(test_result) case None: print_warning(f"{result.test_name} test runner returned None: {test_func}") stats = None case _: - print_warning(f"Unknown test result type: {type(test_result)}") - stats = None + # Don't import doctest at top level since only few tests return + # a doctest.TestResult instance. + import doctest + if isinstance(test_result, doctest.TestResults): + stats = TestStats.from_doctest(test_result) + else: + print_warning(f"Unknown test result type: {type(test_result)}") + stats = None result.stats = stats From f108976ebbfe28bc5c9a62d95555c7adc396b82f Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Thu, 9 Nov 2023 17:46:27 +0100 Subject: [PATCH 180/265] [3.11] gh-111881: Use lazy import in test.support (#111885) (#111890) (#111902) [3.12] gh-111881: Use lazy import in test.support (#111885) (#111890) gh-111881: Use lazy import in test.support (#111885) * Import lazily getpass in test.support Backport to 3.11: test.support.os_helper is unchanged. 
(cherry picked from commit 0372e3b02a7e3dc1c564dba94dcd817c5472b04f) (cherry picked from commit e983ca859de279682631dbb13b959f14a7d89a7b) --- Lib/test/support/__init__.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Lib/test/support/__init__.py b/Lib/test/support/__init__.py index 7ebc42e5bdd01c..dc7a6e6741b8bb 100644 --- a/Lib/test/support/__init__.py +++ b/Lib/test/support/__init__.py @@ -6,7 +6,6 @@ import contextlib import dataclasses import functools -import getpass import os import re import stat @@ -380,6 +379,7 @@ def wrapper(*args, **kw): def skip_if_buildbot(reason=None): """Decorator raising SkipTest if running on a buildbot.""" + import getpass if not reason: reason = 'not suitable for buildbots' try: From 316221c23ff1b8acfe5c51298373464191c7d47f Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 9 Nov 2023 20:13:43 +0100 Subject: [PATCH 181/265] [3.11] gh-111895: Convert definition list to bullet list for readability on mobile (GH-111898) (#111909) gh-111895: Convert definition list to bullet list for readability on mobile (GH-111898) Convert definition list to bullet list for readability on mobile (cherry picked from commit 7d21e3d5ee9858aee570aa6c5b6a6e87d776f4b5) Co-authored-by: Hugo van Kemenade --- Doc/howto/enum.rst | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/Doc/howto/enum.rst b/Doc/howto/enum.rst index 3deb071144633b..a533b5f1020a1b 100644 --- a/Doc/howto/enum.rst +++ b/Doc/howto/enum.rst @@ -573,9 +573,9 @@ The complete signature is:: start=1, ) -:value: What the new enum class will record as its name. +* *value*: What the new enum class will record as its name. -:names: The enum members. This can be a whitespace- or comma-separated string +* *names*: The enum members. This can be a whitespace- or comma-separated string (values will start at 1 unless otherwise specified):: 'RED GREEN BLUE' | 'RED,GREEN,BLUE' | 'RED, GREEN, BLUE' @@ -592,13 +592,13 @@ The complete signature is:: {'CHARTREUSE': 7, 'SEA_GREEN': 11, 'ROSEMARY': 42} -:module: name of module where new enum class can be found. +* *module*: name of module where new enum class can be found. -:qualname: where in module new enum class can be found. +* *qualname*: where in module new enum class can be found. -:type: type to mix in to new enum class. +* *type*: type to mix in to new enum class. -:start: number to start counting at if only names are passed in. +* *start*: number to start counting at if only names are passed in. .. versionchanged:: 3.5 The *start* parameter was added. From 078cdccc2ae343ed66cfc044c8150575f8d6cf94 Mon Sep 17 00:00:00 2001 From: Vinay Sajip Date: Thu, 9 Nov 2023 19:51:43 +0000 Subject: [PATCH 182/265] [3.11] gh-110875: Handle '.' properties in logging formatter configuration correctly. (GH-110943) (GH-111914) Co-authored-by: Vinay Sajip --- Lib/logging/config.py | 12 ++++++------ Lib/test/test_logging.py | 42 ++++++++++++++++++++++++++++++++++++++-- 2 files changed, 46 insertions(+), 8 deletions(-) diff --git a/Lib/logging/config.py b/Lib/logging/config.py index 7e78a64a2f08a6..735ffbe1946780 100644 --- a/Lib/logging/config.py +++ b/Lib/logging/config.py @@ -1,4 +1,4 @@ -# Copyright 2001-2019 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2023 by Vinay Sajip. All Rights Reserved. 
# # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -19,7 +19,7 @@ is based on PEP 282 and comments thereto in comp.lang.python, and influenced by Apache's log4j system. -Copyright (C) 2001-2019 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2023 Vinay Sajip. All Rights Reserved. To use, simply 'import logging' and log away! """ @@ -477,10 +477,10 @@ def configure_custom(self, config): c = config.pop('()') if not callable(c): c = self.resolve(c) - props = config.pop('.', None) # Check for valid identifiers - kwargs = {k: config[k] for k in config if valid_ident(k)} + kwargs = {k: config[k] for k in config if (k != '.' and valid_ident(k))} result = c(**kwargs) + props = config.pop('.', None) if props: for name, value in props.items(): setattr(result, name, value) @@ -752,8 +752,7 @@ def configure_handler(self, config): 'address' in config: config['address'] = self.as_tuple(config['address']) factory = klass - props = config.pop('.', None) - kwargs = {k: config[k] for k in config if valid_ident(k)} + kwargs = {k: config[k] for k in config if (k != '.' and valid_ident(k))} try: result = factory(**kwargs) except TypeError as te: @@ -771,6 +770,7 @@ def configure_handler(self, config): result.setLevel(logging._checkLevel(level)) if filters: self.add_filters(result, filters) + props = config.pop('.', None) if props: for name, value in props.items(): setattr(result, name, value) diff --git a/Lib/test/test_logging.py b/Lib/test/test_logging.py index e097acd85eb72b..757cdc72bbc89c 100644 --- a/Lib/test/test_logging.py +++ b/Lib/test/test_logging.py @@ -1,4 +1,4 @@ -# Copyright 2001-2022 by Vinay Sajip. All Rights Reserved. +# Copyright 2001-2023 by Vinay Sajip. All Rights Reserved. # # Permission to use, copy, modify, and distribute this software and its # documentation for any purpose and without fee is hereby granted, @@ -16,7 +16,7 @@ """Test harness for the logging module. Run all tests. -Copyright (C) 2001-2022 Vinay Sajip. All Rights Reserved. +Copyright (C) 2001-2023 Vinay Sajip. All Rights Reserved. """ import logging @@ -2881,6 +2881,39 @@ class ConfigDictTest(BaseTest): }, } + class CustomFormatter(logging.Formatter): + custom_property = "." 
+ + def format(self, record): + return super().format(record) + + config17 = { + 'version': 1, + 'formatters': { + "custom": { + "()": CustomFormatter, + "style": "{", + "datefmt": "%Y-%m-%d %H:%M:%S", + "format": "{message}", # <-- to force an exception when configuring + ".": { + "custom_property": "value" + } + } + }, + 'handlers' : { + 'hand1' : { + 'class' : 'logging.StreamHandler', + 'formatter' : 'custom', + 'level' : 'NOTSET', + 'stream' : 'ext://sys.stdout', + }, + }, + 'root' : { + 'level' : 'WARNING', + 'handlers' : ['hand1'], + }, + } + out_of_order = { "version": 1, "formatters": { @@ -3295,6 +3328,11 @@ def cleanup(h1, fn): handler = logging.root.handlers[0] self.addCleanup(cleanup, handler, fn) + def test_config17_ok(self): + self.apply_config(self.config17) + h = logging._handlers['hand1'] + self.assertEqual(h.formatter.custom_property, 'value') + def setup_via_listener(self, text, verify=None): text = text.encode("utf-8") # Ask for a randomly assigned port (by using port 0) From 226f4bc69234cb64e52c027e548d4a6dc7f8f0c5 Mon Sep 17 00:00:00 2001 From: Brett Cannon Date: Thu, 9 Nov 2023 15:36:15 -0800 Subject: [PATCH 183/265] [3.11] GH-111804: Drop posix.fallocate() under WASI (GH-111869) (GH-111920) GH-111804: Drop posix.fallocate() under WASI (GH-111869) Drop posix.fallocate() under WASI. The underlying POSIX function, posix_fallocate(), was found to vary too much between implementations to remain in WASI. As such, while it was available in WASI preview1, it's been dropped in preview2. --- .../Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst | 2 ++ Modules/clinic/posixmodule.c.h | 6 +++--- Modules/posixmodule.c | 7 +++++-- 3 files changed, 10 insertions(+), 5 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst diff --git a/Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst b/Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst new file mode 100644 index 00000000000000..2696f2f492a8b0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst @@ -0,0 +1,2 @@ +Remove posix.fallocate() under WASI as the underlying posix_fallocate() is +not available in WASI preview2. 
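Since ``os.posix_fallocate()`` is not compiled in everywhere (it was already absent on Windows and is dropped under WASI by this change), portable callers usually guard its use; a small sketch, with the helper name being an illustrative choice::

    import os

    def preallocate(fd, nbytes):
        # Only call posix_fallocate() where the platform provides it;
        # elsewhere the file simply grows on demand.
        if hasattr(os, 'posix_fallocate'):
            os.posix_fallocate(fd, 0, nbytes)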
diff --git a/Modules/clinic/posixmodule.c.h b/Modules/clinic/posixmodule.c.h index e5efd1a3a547ab..8d5a5fecbb2287 100644 --- a/Modules/clinic/posixmodule.c.h +++ b/Modules/clinic/posixmodule.c.h @@ -6259,7 +6259,7 @@ os_truncate(PyObject *module, PyObject *const *args, Py_ssize_t nargs, PyObject #endif /* (defined HAVE_TRUNCATE || defined MS_WINDOWS) */ -#if (defined(HAVE_POSIX_FALLOCATE) && !defined(POSIX_FADVISE_AIX_BUG)) +#if (defined(HAVE_POSIX_FALLOCATE) && !defined(POSIX_FADVISE_AIX_BUG) && !defined(__wasi__)) PyDoc_STRVAR(os_posix_fallocate__doc__, "posix_fallocate($module, fd, offset, length, /)\n" @@ -6304,7 +6304,7 @@ os_posix_fallocate(PyObject *module, PyObject *const *args, Py_ssize_t nargs) return return_value; } -#endif /* (defined(HAVE_POSIX_FALLOCATE) && !defined(POSIX_FADVISE_AIX_BUG)) */ +#endif /* (defined(HAVE_POSIX_FALLOCATE) && !defined(POSIX_FADVISE_AIX_BUG) && !defined(__wasi__)) */ #if (defined(HAVE_POSIX_FADVISE) && !defined(POSIX_FADVISE_AIX_BUG)) @@ -9388,4 +9388,4 @@ os_waitstatus_to_exitcode(PyObject *module, PyObject *const *args, Py_ssize_t na #ifndef OS_WAITSTATUS_TO_EXITCODE_METHODDEF #define OS_WAITSTATUS_TO_EXITCODE_METHODDEF #endif /* !defined(OS_WAITSTATUS_TO_EXITCODE_METHODDEF) */ -/*[clinic end generated code: output=8277545a0dea1505 input=a9049054013a1b77]*/ +/*[clinic end generated code: output=b649ad9a4e1f2427 input=a9049054013a1b77]*/ diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index 2e4a66ab1bb64a..18e983e5aa1d99 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -10987,7 +10987,10 @@ os_truncate_impl(PyObject *module, path_t *path, Py_off_t length) #endif -#if defined(HAVE_POSIX_FALLOCATE) && !defined(POSIX_FADVISE_AIX_BUG) +/* GH-111804: Due to posix_fallocate() not having consistent semantics across + OSs, support was dropped in WASI preview2. */ +#if defined(HAVE_POSIX_FALLOCATE) && !defined(POSIX_FADVISE_AIX_BUG) && \ + !defined(__wasi__) /*[clinic input] os.posix_fallocate @@ -11025,7 +11028,7 @@ os_posix_fallocate_impl(PyObject *module, int fd, Py_off_t offset, errno = result; return posix_error(); } -#endif /* HAVE_POSIX_FALLOCATE) && !POSIX_FADVISE_AIX_BUG */ +#endif /* HAVE_POSIX_FALLOCATE) && !POSIX_FADVISE_AIX_BUG && !defined(__wasi__) */ #if defined(HAVE_POSIX_FADVISE) && !defined(POSIX_FADVISE_AIX_BUG) From 0e45786a6aed3d70ad346fd95272858da01464b7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 10 Nov 2023 08:32:41 +0100 Subject: [PATCH 184/265] gh-111356: io: Add missing documented objects to io.__all__ (GH-111370) Add DEFAULT_BUFFER_SIZE, text_encoding, and IncrementalNewlineDecoder. 
(cherry picked from commit baeb7718f8981319c5cb1fbdd46d162ded7964ea) Co-authored-by: Nicolas Tessore --- Lib/io.py | 3 ++- Lib/test/test_io.py | 24 +++++++++++-------- ...-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst | 1 + 3 files changed, 17 insertions(+), 11 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst diff --git a/Lib/io.py b/Lib/io.py index a4186499c64715..a58ab01ede3a70 100644 --- a/Lib/io.py +++ b/Lib/io.py @@ -45,7 +45,8 @@ "FileIO", "BytesIO", "StringIO", "BufferedIOBase", "BufferedReader", "BufferedWriter", "BufferedRWPair", "BufferedRandom", "TextIOBase", "TextIOWrapper", - "UnsupportedOperation", "SEEK_SET", "SEEK_CUR", "SEEK_END"] + "UnsupportedOperation", "SEEK_SET", "SEEK_CUR", "SEEK_END", + "DEFAULT_BUFFER_SIZE", "text_encoding", "IncrementalNewlineDecoder"] import _io diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py index e54a13c2d84bde..0f1d526b0e7b3b 100644 --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -3949,19 +3949,18 @@ class PyIncrementalNewlineDecoderTest(IncrementalNewlineDecoderTest): class MiscIOTest(unittest.TestCase): + # for test__all__, actual values are set in subclasses + name_of_module = None + extra_exported = () + not_exported = () + def tearDown(self): os_helper.unlink(os_helper.TESTFN) def test___all__(self): - for name in self.io.__all__: - obj = getattr(self.io, name, None) - self.assertIsNotNone(obj, name) - if name in ("open", "open_code"): - continue - elif "error" in name.lower() or name == "UnsupportedOperation": - self.assertTrue(issubclass(obj, Exception), name) - elif not name.startswith("SEEK_"): - self.assertTrue(issubclass(obj, self.IOBase)) + support.check__all__(self, self.io, self.name_of_module, + extra=self.extra_exported, + not_exported=self.not_exported) def test_attributes(self): f = self.open(os_helper.TESTFN, "wb", buffering=0) @@ -4328,6 +4327,8 @@ def test_openwrapper(self): class CMiscIOTest(MiscIOTest): io = io + name_of_module = "io", "_io" + extra_exported = "BlockingIOError", def test_readinto_buffer_overflow(self): # Issue #18025 @@ -4392,6 +4393,9 @@ def test_daemon_threads_shutdown_stderr_deadlock(self): class PyMiscIOTest(MiscIOTest): io = pyio + name_of_module = "_pyio", "io" + extra_exported = "BlockingIOError", "open_code", + not_exported = "valid_seek_flags", @unittest.skipIf(os.name == 'nt', 'POSIX signals required for this test.') @@ -4679,7 +4683,7 @@ def load_tests(loader, tests, pattern): mocks = (MockRawIO, MisbehavedRawIO, MockFileIO, CloseFailureIO, MockNonBlockWriterIO, MockUnseekableIO, MockRawIOWithoutRead, SlowFlushRawIO) - all_members = io.__all__ + ["IncrementalNewlineDecoder"] + all_members = io.__all__ c_io_ns = {name : getattr(io, name) for name in all_members} py_io_ns = {name : getattr(pyio, name) for name in all_members} globs = globals() diff --git a/Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst b/Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst new file mode 100644 index 00000000000000..a821b52b982175 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst @@ -0,0 +1 @@ +Added :func:`io.text_encoding()`, :data:`io.DEFAULT_BUFFER_SIZE`, and :class:`io.IncrementalNewlineDecoder` to ``io.__all__``. 
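A quick way to check the effect of this change from an interactive prompt; the three names are taken from the patch itself::

    import io

    added = {"DEFAULT_BUFFER_SIZE", "text_encoding", "IncrementalNewlineDecoder"}
    # Before the change these names were importable but missing from
    # io.__all__; with the change applied the difference below is empty.
    print(added - set(io.__all__))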
From ecc8df2ee0f32de542df8ea26d0c0533c98ba677 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 10 Nov 2023 10:54:38 +0100 Subject: [PATCH 185/265] [3.11] gh-111929: Fix regrtest --pgo: test_str => test_unicode (GH-111938) (#111940) gh-111929: Fix regrtest --pgo: test_str => test_unicode (GH-111938) test_unicode was renamed to test_str in Python 3.13, but Python 3.12 still uses test_unicode name. (cherry picked from commit 5f42a2bc4017f2ed023f9cf19fdbffabd57527f5) Co-authored-by: Victor Stinner --- Lib/test/libregrtest/pgo.py | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Lib/test/libregrtest/pgo.py b/Lib/test/libregrtest/pgo.py index e3a6927be5db1d..cabbba73d5eff5 100644 --- a/Lib/test/libregrtest/pgo.py +++ b/Lib/test/libregrtest/pgo.py @@ -42,10 +42,10 @@ 'test_set', 'test_sqlite3', 'test_statistics', - 'test_str', 'test_struct', 'test_tabnanny', 'test_time', + 'test_unicode', 'test_xml_etree', 'test_xml_etree_c', ] From 7b8445109d7dd8b319f5239ca013e83938579b4c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 10 Nov 2023 13:23:57 +0100 Subject: [PATCH 186/265] [3.11] gh-108303: Install `Lib/test/configdata` (GH-111899) (#111945) gh-108303: Install `Lib/test/configdata` (GH-111899) (cherry picked from commit 65d6dc27156112ac6a9f722b7b62529c94e0344b) Co-authored-by: Nikita Sobolev --- Makefile.pre.in | 1 + 1 file changed, 1 insertion(+) diff --git a/Makefile.pre.in b/Makefile.pre.in index ada866d83c0423..21aecd4161adb2 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -1951,6 +1951,7 @@ TESTSUBDIRS= ctypes/test \ test/certdata/capath \ test/cjkencodings \ test/crashers \ + test/configdata \ test/data \ test/decimaltestdata \ test/dtracedata \ From 8222ad01fe5e802649f0a1bd30d26d4623533c51 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 10 Nov 2023 14:32:02 +0100 Subject: [PATCH 187/265] [3.11] [3.12] gh-109181: Fix refleak in tb_get_lineno() (GH-111948) (#111951) [3.12] gh-109181: Fix refleak in tb_get_lineno() (GH-111948) PyFrame_GetCode() returns a strong reference. 
(cherry picked from commit 4b0c875d91727440251a8427a80d8515e39d18cd) Co-authored-by: Victor Stinner --- Python/traceback.c | 5 ++++- 1 file changed, 4 insertions(+), 1 deletion(-) diff --git a/Python/traceback.c b/Python/traceback.c index 664a73fdfbaa31..722f007459dc3b 100644 --- a/Python/traceback.c +++ b/Python/traceback.c @@ -115,7 +115,10 @@ static int tb_get_lineno(PyTracebackObject* tb) { PyFrameObject* frame = tb->tb_frame; assert(frame != NULL); - return PyCode_Addr2Line(PyFrame_GetCode(frame), tb->tb_lasti); + PyCodeObject *code = PyFrame_GetCode(frame); + int lineno = PyCode_Addr2Line(code, tb->tb_lasti); + Py_DECREF(code); + return lineno; } static PyObject * From e433785e489285b2f9c77731c799294309c2cec9 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 10 Nov 2023 15:12:24 +0100 Subject: [PATCH 188/265] [3.11] gh-111912: Run test_posix on Windows (GH-111913) (GH-111954) (cherry picked from commit 64fea3211d08082236d05c38ee728f922eb7d8ed) Co-authored-by: Serhiy Storchaka --- Lib/test/test_posix.py | 10 +++++++--- 1 file changed, 7 insertions(+), 3 deletions(-) diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index e4956b55a07e0a..912f07ec4436b5 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -6,9 +6,6 @@ from test.support import warnings_helper from test.support.script_helper import assert_python_ok -# Skip these tests if there is no posix module. -posix = import_helper.import_module('posix') - import errno import sys import signal @@ -22,6 +19,11 @@ import textwrap from contextlib import contextmanager +try: + import posix +except ImportError: + import nt as posix + try: import pwd except ImportError: @@ -1009,6 +1011,7 @@ def test_environ(self): self.assertEqual(type(k), item_type) self.assertEqual(type(v), item_type) + @unittest.skipUnless(os.name == 'posix', "see bug gh-111841") def test_putenv(self): with self.assertRaises(ValueError): os.putenv('FRUIT\0VEGETABLE', 'cabbage') @@ -1220,6 +1223,7 @@ def test_sched_setaffinity(self): self.assertRaises(OSError, posix.sched_setaffinity, -1, mask) @unittest.skipIf(support.is_wasi, "No dynamic linking on WASI") + @unittest.skipUnless(os.name == 'posix', "POSIX-only test") def test_rtld_constants(self): # check presence of major RTLD_* constants posix.RTLD_LAZY From 7b55a955bc4066c5db4c842b3d7319df487fb750 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 10 Nov 2023 15:43:51 +0100 Subject: [PATCH 189/265] [3.11] gh-111251: Fix error checking in _blake2 module init (GH-111252) (#111298) Co-authored-by: Nikita Sobolev Co-authored-by: Hugo van Kemenade --- ...-10-24-12-09-46.gh-issue-111251.urFYtn.rst | 1 + Modules/_blake2/blake2module.c | 25 +++++++++++++------ 2 files changed, 18 insertions(+), 8 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst diff --git a/Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst b/Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst new file mode 100644 index 00000000000000..3a87cb25da5cb4 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst @@ -0,0 +1 @@ +Fix :mod:`_blake2` not checking for errors when initializing. 
diff --git a/Modules/_blake2/blake2module.c b/Modules/_blake2/blake2module.c index 44d783b40d0453..93478f5c8bda0d 100644 --- a/Modules/_blake2/blake2module.c +++ b/Modules/_blake2/blake2module.c @@ -74,6 +74,12 @@ _blake2_free(void *module) Py_DECREF(x); \ } while(0) +#define ADD_INT_CONST(NAME, VALUE) do { \ + if (PyModule_AddIntConstant(m, NAME, VALUE) < 0) { \ + return -1; \ + } \ +} while (0) + static int blake2_exec(PyObject *m) { @@ -95,10 +101,10 @@ blake2_exec(PyObject *m) ADD_INT(d, "MAX_KEY_SIZE", BLAKE2B_KEYBYTES); ADD_INT(d, "MAX_DIGEST_SIZE", BLAKE2B_OUTBYTES); - PyModule_AddIntConstant(m, "BLAKE2B_SALT_SIZE", BLAKE2B_SALTBYTES); - PyModule_AddIntConstant(m, "BLAKE2B_PERSON_SIZE", BLAKE2B_PERSONALBYTES); - PyModule_AddIntConstant(m, "BLAKE2B_MAX_KEY_SIZE", BLAKE2B_KEYBYTES); - PyModule_AddIntConstant(m, "BLAKE2B_MAX_DIGEST_SIZE", BLAKE2B_OUTBYTES); + ADD_INT_CONST("BLAKE2B_SALT_SIZE", BLAKE2B_SALTBYTES); + ADD_INT_CONST("BLAKE2B_PERSON_SIZE", BLAKE2B_PERSONALBYTES); + ADD_INT_CONST("BLAKE2B_MAX_KEY_SIZE", BLAKE2B_KEYBYTES); + ADD_INT_CONST("BLAKE2B_MAX_DIGEST_SIZE", BLAKE2B_OUTBYTES); /* BLAKE2s */ st->blake2s_type = (PyTypeObject *)PyType_FromModuleAndSpec( @@ -117,14 +123,17 @@ blake2_exec(PyObject *m) ADD_INT(d, "MAX_KEY_SIZE", BLAKE2S_KEYBYTES); ADD_INT(d, "MAX_DIGEST_SIZE", BLAKE2S_OUTBYTES); - PyModule_AddIntConstant(m, "BLAKE2S_SALT_SIZE", BLAKE2S_SALTBYTES); - PyModule_AddIntConstant(m, "BLAKE2S_PERSON_SIZE", BLAKE2S_PERSONALBYTES); - PyModule_AddIntConstant(m, "BLAKE2S_MAX_KEY_SIZE", BLAKE2S_KEYBYTES); - PyModule_AddIntConstant(m, "BLAKE2S_MAX_DIGEST_SIZE", BLAKE2S_OUTBYTES); + ADD_INT_CONST("BLAKE2S_SALT_SIZE", BLAKE2S_SALTBYTES); + ADD_INT_CONST("BLAKE2S_PERSON_SIZE", BLAKE2S_PERSONALBYTES); + ADD_INT_CONST("BLAKE2S_MAX_KEY_SIZE", BLAKE2S_KEYBYTES); + ADD_INT_CONST("BLAKE2S_MAX_DIGEST_SIZE", BLAKE2S_OUTBYTES); return 0; } +#undef ADD_INT +#undef ADD_INT_CONST + static PyModuleDef_Slot _blake2_slots[] = { {Py_mod_exec, blake2_exec}, {0, NULL} From cd3e2d3a6ca4bfcc8c99eb50e92a2ce0b5b1ddf6 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 11 Nov 2023 10:17:29 +0100 Subject: [PATCH 190/265] [3.11] gh-111841: Fix os.putenv() and os.unsetenv() with embedded NUL on Windows (GH-111842) (GH-111967) (cherry picked from commit 0b06d2482d77e02c5d40e221f6046c9c355458b2) Co-authored-by: Serhiy Storchaka --- Lib/test/test_os.py | 5 ++++- Lib/test/test_posix.py | 14 +++++++------- .../2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst | 2 ++ Modules/posixmodule.c | 7 ++++++- 4 files changed, 19 insertions(+), 9 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst diff --git a/Lib/test/test_os.py b/Lib/test/test_os.py index ff1121efdf74da..441e1f977a55d3 100644 --- a/Lib/test/test_os.py +++ b/Lib/test/test_os.py @@ -1130,9 +1130,12 @@ def test_putenv_unsetenv(self): def test_putenv_unsetenv_error(self): # Empty variable name is invalid. # "=" and null character are not allowed in a variable name. 
- for name in ('', '=name', 'na=me', 'name=', 'name\0', 'na\0me'): + for name in ('', '=name', 'na=me', 'name='): self.assertRaises((OSError, ValueError), os.putenv, name, "value") self.assertRaises((OSError, ValueError), os.unsetenv, name) + for name in ('name\0', 'na\0me'): + self.assertRaises(ValueError, os.putenv, name, "value") + self.assertRaises(ValueError, os.unsetenv, name) if sys.platform == "win32": # On Windows, an environment variable string ("name=value" string) diff --git a/Lib/test/test_posix.py b/Lib/test/test_posix.py index 912f07ec4436b5..0b1068bfff6ff9 100644 --- a/Lib/test/test_posix.py +++ b/Lib/test/test_posix.py @@ -1011,20 +1011,20 @@ def test_environ(self): self.assertEqual(type(k), item_type) self.assertEqual(type(v), item_type) - @unittest.skipUnless(os.name == 'posix', "see bug gh-111841") def test_putenv(self): with self.assertRaises(ValueError): os.putenv('FRUIT\0VEGETABLE', 'cabbage') - with self.assertRaises(ValueError): - os.putenv(b'FRUIT\0VEGETABLE', b'cabbage') with self.assertRaises(ValueError): os.putenv('FRUIT', 'orange\0VEGETABLE=cabbage') - with self.assertRaises(ValueError): - os.putenv(b'FRUIT', b'orange\0VEGETABLE=cabbage') with self.assertRaises(ValueError): os.putenv('FRUIT=ORANGE', 'lemon') - with self.assertRaises(ValueError): - os.putenv(b'FRUIT=ORANGE', b'lemon') + if os.name == 'posix': + with self.assertRaises(ValueError): + os.putenv(b'FRUIT\0VEGETABLE', b'cabbage') + with self.assertRaises(ValueError): + os.putenv(b'FRUIT', b'orange\0VEGETABLE=cabbage') + with self.assertRaises(ValueError): + os.putenv(b'FRUIT=ORANGE', b'lemon') @unittest.skipUnless(hasattr(posix, 'getcwd'), 'test needs posix.getcwd()') def test_getcwd_long_pathnames(self): diff --git a/Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst b/Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst new file mode 100644 index 00000000000000..cd1780988aeac7 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst @@ -0,0 +1,2 @@ +Fix truncating arguments on an embedded null character in :meth:`os.putenv` +and :meth:`os.unsetenv` on Windows. diff --git a/Modules/posixmodule.c b/Modules/posixmodule.c index 18e983e5aa1d99..9ac021f6c5f8b0 100644 --- a/Modules/posixmodule.c +++ b/Modules/posixmodule.c @@ -11102,7 +11102,6 @@ win32_putenv(PyObject *name, PyObject *value) } Py_ssize_t size; - /* PyUnicode_AsWideCharString() rejects embedded null characters */ wchar_t *env = PyUnicode_AsWideCharString(unicode, &size); Py_DECREF(unicode); @@ -11116,6 +11115,12 @@ win32_putenv(PyObject *name, PyObject *value) PyMem_Free(env); return NULL; } + if (wcslen(env) != (size_t)size) { + PyErr_SetString(PyExc_ValueError, + "embedded null character"); + PyMem_Free(env); + return NULL; + } /* _wputenv() and SetEnvironmentVariableW() update the environment in the Process Environment Block (PEB). _wputenv() also updates CRT 'environ' From 5a6d1dbc10878ff2d04b2e0687409855787312ff Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 12 Nov 2023 01:24:02 +0100 Subject: [PATCH 191/265] [3.11] Fix undefined behaviour in datetime.time.fromisoformat() (GH-111982) (#111991) Fix undefined behaviour in datetime.time.fromisoformat() (GH-111982) Fix undefined behaviour in datetime.time.fromisoformat() when parsing a string without a timezone. 
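As a quick illustration of the affected path (a sketch, not part of the patch), the uninitialized offset only comes into play for inputs that carry no UTC offset:

    # Sketch for illustration only.
    from datetime import time

    naive = time.fromisoformat("04:23:01.000384")  # no offset, tzinfo stays None
    aware = time.fromisoformat("04:23:01+04:00")   # offset present, aware time
    print(naive.tzinfo)  # None
    print(aware.tzinfo)  # UTC+04:00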
'tzoffset' is not assigned to by parse_isoformat_time if it returns 0, but time_fromisoformat then passes tzoffset to another function, which is undefined behaviour (even if the function in question does not use the value). (cherry picked from commit 21615f77b5a580e83589abae618dbe7c298700e2) Co-authored-by: T. Wouters --- Modules/_datetimemodule.c | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Modules/_datetimemodule.c b/Modules/_datetimemodule.c index 5c67f38ec2111b..df0ce722d59af8 100644 --- a/Modules/_datetimemodule.c +++ b/Modules/_datetimemodule.c @@ -4649,7 +4649,7 @@ time_fromisoformat(PyObject *cls, PyObject *tstr) { } int hour = 0, minute = 0, second = 0, microsecond = 0; - int tzoffset, tzimicrosecond = 0; + int tzoffset = 0, tzimicrosecond = 0; int rv = parse_isoformat_time(p, len, &hour, &minute, &second, &microsecond, &tzoffset, &tzimicrosecond); From 9536fe1656a8be775172bf8d14751ad6e3ebbd0a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 12 Nov 2023 13:49:59 +0100 Subject: [PATCH 192/265] [3.11] gh-112001: Fix test_builtins_have_signatures in test_inspect (GH-112002) (GH-112004) (cherry picked from commit 40752c1c1e8cec80e99a2c9796f4fde2f8b5d3e2) Co-authored-by: Serhiy Storchaka --- Lib/test/test_inspect/test_inspect.py | 19 +++++++------------ 1 file changed, 7 insertions(+), 12 deletions(-) diff --git a/Lib/test/test_inspect/test_inspect.py b/Lib/test/test_inspect/test_inspect.py index 6d4050a5ded100..f6d95bd26304fb 100644 --- a/Lib/test/test_inspect/test_inspect.py +++ b/Lib/test/test_inspect/test_inspect.py @@ -4285,19 +4285,14 @@ def test_builtins_have_signatures(self): # These have unrepresentable parameter default values of NULL needs_null = {"anext"} no_signature |= needs_null - # These need PEP 457 groups or a signature change to accept None - needs_semantic_update = {"round"} - no_signature |= needs_semantic_update # These need *args support in Argument Clinic - needs_varargs = {"breakpoint", "min", "max", "print", - "__build_class__"} + needs_varargs = {"breakpoint", "min", "max", "__build_class__"} no_signature |= needs_varargs - # These simply weren't covered in the initial AC conversion - # for builtin callables - not_converted_yet = {"open", "__import__"} - no_signature |= not_converted_yet # These builtin types are expected to provide introspection info - types_with_signatures = set() + types_with_signatures = { + 'complex', 'enumerate', 'float', 'list', 'memoryview', 'object', + 'property', 'reversed', 'tuple', + } # Check the signatures we expect to be there ns = vars(builtins) for name, obj in sorted(ns.items()): @@ -4316,9 +4311,9 @@ def test_builtins_have_signatures(self): # This ensures this test will start failing as more signatures are # added, so the affected items can be moved into the scope of the # regression test above - for name in no_signature: + for name in no_signature - needs_null: with self.subTest(builtin=name): - self.assertIsNone(obj.__text_signature__) + self.assertIsNone(ns[name].__text_signature__) def test_python_function_override_signature(self): def func(*args, **kwargs): From aa3232db23e0dabbabb0a1f8fc9bb20bf7ec68c9 Mon Sep 17 00:00:00 2001 From: Victor Stinner Date: Mon, 13 Nov 2023 00:00:38 +0100 Subject: [PATCH 193/265] [3.11] gh-111929: Fix regrtest clear_caches() (#111949) gh-111929: Fix regrtest clear_caches() Python 3.11 doesn't have the fractions._hash_algorithm.cache_clear() function.
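A version-tolerant variant of that cleanup would clear the cache only where it exists; a minimal sketch (the helper name is hypothetical, and _hash_algorithm is a private attribute that only newer versions provide):

    # Sketch only, not part of the patch: _clear_fractions_cache is a
    # hypothetical helper and fractions._hash_algorithm is private API.
    import sys

    def _clear_fractions_cache():
        fractions = sys.modules.get('fractions')
        if fractions is None:
            return  # module never imported, nothing cached
        algo = getattr(fractions, '_hash_algorithm', None)
        cache_clear = getattr(algo, 'cache_clear', None)
        if cache_clear is not None:
            cache_clear()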
--- Lib/test/libregrtest/utils.py | 7 ------- 1 file changed, 7 deletions(-) diff --git a/Lib/test/libregrtest/utils.py b/Lib/test/libregrtest/utils.py index f6de62af1f0ed5..25f3e853137316 100644 --- a/Lib/test/libregrtest/utils.py +++ b/Lib/test/libregrtest/utils.py @@ -272,13 +272,6 @@ def clear_caches(): for f in typing._cleanups: f() - try: - fractions = sys.modules['fractions'] - except KeyError: - pass - else: - fractions._hash_algorithm.cache_clear() - def get_build_info(): # Get most important configure and build options as a list of strings. From b64afbc55c27274a73ade0680606cde2a26470cb Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 13 Nov 2023 01:14:03 +0100 Subject: [PATCH 194/265] [3.11] gh-111944: Add assignment expression parentheses requirements (GH-111977) (#112011) Augment the list of places where parentheses are required around assignnment statements. In particular, 'a := 0' and 'a = b := 1' are syntax errors. (cherry picked from commit 9a2f25d374f027f6509484d66e1c7bba03977b99) Co-authored-by: Terry Jan Reedy --- Doc/reference/expressions.rst | 9 +++++---- 1 file changed, 5 insertions(+), 4 deletions(-) diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index 5cba4a6934f05b..9207fd3f3c02ee 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -1764,10 +1764,11 @@ Or, when processing a file stream in chunks: while chunk := file.read(9000): process(chunk) -Assignment expressions must be surrounded by parentheses when used -as sub-expressions in slicing, conditional, lambda, -keyword-argument, and comprehension-if expressions -and in ``assert`` and ``with`` statements. +Assignment expressions must be surrounded by parentheses when +used as expression statements and when used as sub-expressions in +slicing, conditional, lambda, +keyword-argument, and comprehension-if expressions and +in ``assert``, ``with``, and ``assignment`` statements. In all other places where they can be used, parentheses are not required, including in ``if`` and ``while`` statements. From d4217e5db570b72774e5254173dd0718eb3c30bd Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Mon, 13 Nov 2023 10:58:43 +0200 Subject: [PATCH 195/265] [3.11] Docs: Add `make htmllive` to rebuild and reload HTML files in your browser (GH-111900) (#112023) Co-authored-by: Hugo van Kemenade --- Doc/Makefile | 35 +++++++++++++++++++++++++++++++---- Doc/requirements.txt | 1 + 2 files changed, 32 insertions(+), 4 deletions(-) diff --git a/Doc/Makefile b/Doc/Makefile index ca82353eb454f3..f01b8a4ff958d9 100644 --- a/Doc/Makefile +++ b/Doc/Makefile @@ -22,16 +22,14 @@ PAPEROPT_letter = -D latex_elements.papersize=letterpaper ALLSPHINXOPTS = -b $(BUILDER) -d build/doctrees $(PAPEROPT_$(PAPER)) -j $(JOBS) \ $(SPHINXOPTS) $(SPHINXERRORHANDLING) . 
build/$(BUILDER) $(SOURCES) -.PHONY: help build html htmlhelp latex text texinfo changes linkcheck \ - suspicious coverage doctest pydoc-topics htmlview clean dist check serve \ - autobuild-dev autobuild-stable venv - +.PHONY: help help: @echo "Please use \`make ' where is one of" @echo " clean to remove build files" @echo " venv to create a venv with necessary tools" @echo " html to make standalone HTML files" @echo " htmlview to open the index page built by the html target in your browser" + @echo " htmllive to rebuild and reload HTML files in your browser" @echo " htmlhelp to make HTML files and a HTML help project" @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter" @echo " text to make plain text files" @@ -46,6 +44,7 @@ help: @echo " suspicious to check for suspicious markup in output text" @echo " check to run a check for frequent markup errors" +.PHONY: build build: -mkdir -p build # Look first for a Misc/NEWS file (building from a source release tarball @@ -72,38 +71,46 @@ build: $(SPHINXBUILD) $(ALLSPHINXOPTS) @echo +.PHONY: html html: BUILDER = html html: build @echo "Build finished. The HTML pages are in build/html." +.PHONY: htmlhelp htmlhelp: BUILDER = htmlhelp htmlhelp: build @echo "Build finished; now you can run HTML Help Workshop with the" \ "build/htmlhelp/pydoc.hhp project file." +.PHONY: latex latex: BUILDER = latex latex: build @echo "Build finished; the LaTeX files are in build/latex." @echo "Run \`make all-pdf' or \`make all-ps' in that directory to" \ "run these through (pdf)latex." +.PHONY: text text: BUILDER = text text: build @echo "Build finished; the text files are in build/text." +.PHONY: texinfo texinfo: BUILDER = texinfo texinfo: build @echo "Build finished; the python.texi file is in build/texinfo." @echo "Run \`make info' in that directory to run it through makeinfo." +.PHONY: epub epub: BUILDER = epub epub: build @echo "Build finished; the epub files are in build/epub." +.PHONY: changes changes: BUILDER = changes changes: build @echo "The overview file is in build/changes." +.PHONY: linkcheck linkcheck: BUILDER = linkcheck linkcheck: @$(MAKE) build BUILDER=$(BUILDER) || { \ @@ -111,6 +118,7 @@ linkcheck: "or in build/$(BUILDER)/output.txt"; \ false; } +.PHONY: suspicious suspicious: BUILDER = suspicious suspicious: @$(MAKE) build BUILDER=$(BUILDER) || { \ @@ -123,10 +131,12 @@ suspicious: @echo "⚠ make check" @echo "⚠ instead." 
+.PHONY: coverage coverage: BUILDER = coverage coverage: build @echo "Coverage finished; see c.txt and python.txt in build/coverage" +.PHONY: doctest doctest: BUILDER = doctest doctest: @$(MAKE) build BUILDER=$(BUILDER) || { \ @@ -134,20 +144,30 @@ doctest: "results in build/doctest/output.txt"; \ false; } +.PHONY: pydoc-topics pydoc-topics: BUILDER = pydoc-topics pydoc-topics: build @echo "Building finished; now run this:" \ "cp build/pydoc-topics/topics.py ../Lib/pydoc_data/topics.py" +.PHONY: htmlview htmlview: html $(PYTHON) -c "import os, webbrowser; webbrowser.open('file://' + os.path.realpath('build/html/index.html'))" +.PHONY: htmllive +htmllive: SPHINXBUILD = $(VENVDIR)/bin/sphinx-autobuild +htmllive: SPHINXOPTS = --re-ignore="/venv/" +htmllive: html + +.PHONY: clean clean: clean-venv -rm -rf build/* +.PHONY: clean-venv clean-venv: rm -rf $(VENVDIR) +.PHONY: venv venv: @if [ -d $(VENVDIR) ] ; then \ echo "venv already exists."; \ @@ -159,6 +179,7 @@ venv: echo "The venv has been created in the $(VENVDIR) directory"; \ fi +.PHONY: dist dist: rm -rf dist mkdir -p dist @@ -213,10 +234,12 @@ dist: rm -r dist/python-$(DISTVERSION)-docs-texinfo rm dist/python-$(DISTVERSION)-docs-texinfo.tar +.PHONY: check check: venv $(VENVDIR)/bin/python3 -m pre_commit --version > /dev/null || $(VENVDIR)/bin/python3 -m pip install pre-commit $(VENVDIR)/bin/python3 -m pre_commit run --all-files +.PHONY: serve serve: @echo "The serve target was removed, use htmlview instead (see bpo-36329)" @@ -228,15 +251,18 @@ serve: # output files) # for development releases: always build +.PHONY: autobuild-dev autobuild-dev: make dist SPHINXOPTS='$(SPHINXOPTS) -Ea -A daily=1' # for quick rebuilds (HTML only) +.PHONY: autobuild-dev-html autobuild-dev-html: make html SPHINXOPTS='$(SPHINXOPTS) -Ea -A daily=1' # for stable releases: only build if not in pre-release stage (alpha, beta) # release candidate downloads are okay, since the stable tree can be in that stage +.PHONY: autobuild-stable autobuild-stable: @case $(DISTVERSION) in *[ab]*) \ echo "Not building; $(DISTVERSION) is not a release version."; \ @@ -244,6 +270,7 @@ autobuild-stable: esac @make autobuild-dev +.PHONY: autobuild-stable-html autobuild-stable-html: @case $(DISTVERSION) in *[ab]*) \ echo "Not building; $(DISTVERSION) is not a release version."; \ diff --git a/Doc/requirements.txt b/Doc/requirements.txt index df79ae63840471..d81fef288caed3 100644 --- a/Doc/requirements.txt +++ b/Doc/requirements.txt @@ -10,6 +10,7 @@ sphinx==4.5.0 blurb +sphinx-autobuild sphinxext-opengraph>=0.7.1 # The theme used by the documentation is stored separately, so we need From 4baf633c5322ce48f2540e2a41ccbe4a92d4025a Mon Sep 17 00:00:00 2001 From: Alex Waygood Date: Mon, 13 Nov 2023 14:01:17 +0000 Subject: [PATCH 196/265] [3.11] gh-111681: minor fix to a typing doctest (#111682) (#112037) --- Doc/library/typing.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/typing.rst b/Doc/library/typing.rst index f00a4ce20c8a51..0eda01a323e490 100644 --- a/Doc/library/typing.rst +++ b/Doc/library/typing.rst @@ -1785,7 +1785,7 @@ for creating generic types. .. 
doctest:: - >>> from typing import ParamSpec + >>> from typing import ParamSpec, get_origin >>> P = ParamSpec("P") >>> get_origin(P.args) is P True From 46081fe9b50230530c34b3594da04b279d52f102 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 13 Nov 2023 20:50:56 +0100 Subject: [PATCH 197/265] [3.11] gh-112007: Re-organize help utility intro message (GH-112017) (#112048) gh-112007: Re-organize help utility intro message (GH-112017) Most important: move how-to-quit sentence to the end and mention 'q'. Re-group the other sentences and improve some wording. --------- (cherry picked from commit b28bb130bbc2ad956828819967d83e06d30a65c5) Co-authored-by: Terry Jan Reedy --- Lib/pydoc.py | 24 +++++++++++++----------- 1 file changed, 13 insertions(+), 11 deletions(-) diff --git a/Lib/pydoc.py b/Lib/pydoc.py index 5e191bc1c4b2a6..14b99039b3bbda 100755 --- a/Lib/pydoc.py +++ b/Lib/pydoc.py @@ -2073,20 +2073,22 @@ def help(self, request, is_cli=False): self.output.write('\n') def intro(self): - self.output.write(''' -Welcome to Python {0}'s help utility! - -If this is your first time using Python, you should definitely check out -the tutorial on the internet at https://docs.python.org/{0}/tutorial/. + self.output.write('''\ +Welcome to Python {0}'s help utility! If this is your first time using +Python, you should definitely check out the tutorial at +https://docs.python.org/{0}/tutorial/. Enter the name of any module, keyword, or topic to get help on writing -Python programs and using Python modules. To quit this help utility and -return to the interpreter, just type "quit". +Python programs and using Python modules. To get a list of available +modules, keywords, symbols, or topics, enter "modules", "keywords", +"symbols", or "topics". + +Each module also comes with a one-line summary of what it does; to list +the modules whose name or summary contain a given string such as "spam", +enter "modules spam". -To get a list of available modules, keywords, symbols, or topics, type -"modules", "keywords", "symbols", or "topics". Each module also comes -with a one-line summary of what it does; to list the modules whose name -or summary contain a given string such as "spam", type "modules spam". +To quit this help utility and return to the interpreter, +enter "q" or "quit". '''.format('%d.%d' % sys.version_info[:2])) def list(self, items, columns=4, width=80): From e73216d32b10ffbf9d0ee82e8c646b91f745ed91 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 14 Nov 2023 13:25:38 +0100 Subject: [PATCH 198/265] [3.11] gh-110944: Move pty helper to test.support and add basic pdb completion test (GH-111826) (GH-112025) gh-110944: Move pty helper to test.support and add basic pdb completion test (GH-111826) (cherry picked from commit 1c7ed7e9ebc53290c831d7b610219fa737153a1b) Co-authored-by: Tian Gao --- Lib/test/support/pty_helper.py | 60 ++++++++++++++++++++++++++++++++++ Lib/test/test_pdb.py | 30 +++++++++++++++++ Lib/test/test_readline.py | 55 +------------------------------ 3 files changed, 91 insertions(+), 54 deletions(-) create mode 100644 Lib/test/support/pty_helper.py diff --git a/Lib/test/support/pty_helper.py b/Lib/test/support/pty_helper.py new file mode 100644 index 00000000000000..11037d22516448 --- /dev/null +++ b/Lib/test/support/pty_helper.py @@ -0,0 +1,60 @@ +""" +Helper to run a script in a pseudo-terminal. 
+""" +import os +import selectors +import subprocess +import sys +from contextlib import ExitStack +from errno import EIO + +from test.support.import_helper import import_module + +def run_pty(script, input=b"dummy input\r", env=None): + pty = import_module('pty') + output = bytearray() + [master, slave] = pty.openpty() + args = (sys.executable, '-c', script) + proc = subprocess.Popen(args, stdin=slave, stdout=slave, stderr=slave, env=env) + os.close(slave) + with ExitStack() as cleanup: + cleanup.enter_context(proc) + def terminate(proc): + try: + proc.terminate() + except ProcessLookupError: + # Workaround for Open/Net BSD bug (Issue 16762) + pass + cleanup.callback(terminate, proc) + cleanup.callback(os.close, master) + # Avoid using DefaultSelector and PollSelector. Kqueue() does not + # work with pseudo-terminals on OS X < 10.9 (Issue 20365) and Open + # BSD (Issue 20667). Poll() does not work with OS X 10.6 or 10.4 + # either (Issue 20472). Hopefully the file descriptor is low enough + # to use with select(). + sel = cleanup.enter_context(selectors.SelectSelector()) + sel.register(master, selectors.EVENT_READ | selectors.EVENT_WRITE) + os.set_blocking(master, False) + while True: + for [_, events] in sel.select(): + if events & selectors.EVENT_READ: + try: + chunk = os.read(master, 0x10000) + except OSError as err: + # Linux raises EIO when slave is closed (Issue 5380) + if err.errno != EIO: + raise + chunk = b"" + if not chunk: + return output + output.extend(chunk) + if events & selectors.EVENT_WRITE: + try: + input = input[os.write(master, input):] + except OSError as err: + # Apparently EIO means the slave was closed + if err.errno != EIO: + raise + input = b"" # Stop writing + if not input: + sel.modify(master, selectors.EVENT_READ) diff --git a/Lib/test/test_pdb.py b/Lib/test/test_pdb.py index 5f9bcc5af82ad6..7d4c686ac77faf 100644 --- a/Lib/test/test_pdb.py +++ b/Lib/test/test_pdb.py @@ -15,6 +15,8 @@ from io import StringIO from test import support from test.support import os_helper +from test.support.import_helper import import_module +from test.support.pty_helper import run_pty # This little helper class is essential for testing pdb under doctest. 
from test.test_doctest import _FakeInput from unittest.mock import patch @@ -2485,6 +2487,34 @@ def test_checkline_is_not_executable(self): self.assertFalse(db.checkline(os_helper.TESTFN, lineno)) +@support.requires_subprocess() +class PdbTestReadline(unittest.TestCase): + def setUpClass(): + # Ensure that the readline module is loaded + # If this fails, the test is skipped because SkipTest will be raised + readline = import_module('readline') + if readline.__doc__ and "libedit" in readline.__doc__: + raise unittest.SkipTest("libedit readline is not supported for pdb") + + def test_basic_completion(self): + script = textwrap.dedent(""" + import pdb; pdb.Pdb().set_trace() + # Concatenate strings so that the output doesn't appear in the source + print('hello' + '!') + """) + + # List everything starting with 'co', there should be multiple matches + # then add ntin and complete 'contin' to 'continue' + input = b"co\t\tntin\t\n" + + output = run_pty(script, input) + + self.assertIn(b'commands', output) + self.assertIn(b'condition', output) + self.assertIn(b'continue', output) + self.assertIn(b'hello!', output) + + def load_tests(loader, tests, pattern): from test import test_pdb tests.addTest(doctest.DocTestSuite(test_pdb)) diff --git a/Lib/test/test_readline.py b/Lib/test/test_readline.py index 59dbef90380053..835280f2281cde 100644 --- a/Lib/test/test_readline.py +++ b/Lib/test/test_readline.py @@ -1,18 +1,15 @@ """ Very minimal unittests for parts of the readline module. """ -from contextlib import ExitStack -from errno import EIO import locale import os -import selectors -import subprocess import sys import tempfile import unittest from test.support import verbose from test.support.import_helper import import_module from test.support.os_helper import unlink, temp_dir, TESTFN +from test.support.pty_helper import run_pty from test.support.script_helper import assert_python_ok # Skip tests if there is no readline module @@ -304,55 +301,5 @@ def test_history_size(self): self.assertEqual(lines[-1].strip(), b"last input") -def run_pty(script, input=b"dummy input\r", env=None): - pty = import_module('pty') - output = bytearray() - [master, slave] = pty.openpty() - args = (sys.executable, '-c', script) - proc = subprocess.Popen(args, stdin=slave, stdout=slave, stderr=slave, env=env) - os.close(slave) - with ExitStack() as cleanup: - cleanup.enter_context(proc) - def terminate(proc): - try: - proc.terminate() - except ProcessLookupError: - # Workaround for Open/Net BSD bug (Issue 16762) - pass - cleanup.callback(terminate, proc) - cleanup.callback(os.close, master) - # Avoid using DefaultSelector and PollSelector. Kqueue() does not - # work with pseudo-terminals on OS X < 10.9 (Issue 20365) and Open - # BSD (Issue 20667). Poll() does not work with OS X 10.6 or 10.4 - # either (Issue 20472). Hopefully the file descriptor is low enough - # to use with select(). 
- sel = cleanup.enter_context(selectors.SelectSelector()) - sel.register(master, selectors.EVENT_READ | selectors.EVENT_WRITE) - os.set_blocking(master, False) - while True: - for [_, events] in sel.select(): - if events & selectors.EVENT_READ: - try: - chunk = os.read(master, 0x10000) - except OSError as err: - # Linux raises EIO when slave is closed (Issue 5380) - if err.errno != EIO: - raise - chunk = b"" - if not chunk: - return output - output.extend(chunk) - if events & selectors.EVENT_WRITE: - try: - input = input[os.write(master, input):] - except OSError as err: - # Apparently EIO means the slave was closed - if err.errno != EIO: - raise - input = b"" # Stop writing - if not input: - sel.modify(master, selectors.EVENT_READ) - - if __name__ == "__main__": unittest.main() From a92b9e5b2b50638c1a01e0e7fe77c81148d74285 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 15 Nov 2023 06:20:17 +0100 Subject: [PATCH 199/265] [3.11] Docs: Add the time to the HTML last updated format (GH-110091) (#112103) Docs: Add the time to the HTML last updated format (GH-110091) (cherry picked from commit 6c214dea7c503eb42bd130d43e8880f39bff0350) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> --- Doc/conf.py | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/Doc/conf.py b/Doc/conf.py index 8c92799ecbbf3b..3f5ba5d969d313 100644 --- a/Doc/conf.py +++ b/Doc/conf.py @@ -280,9 +280,8 @@ "pr_id": os.getenv("READTHEDOCS_VERSION") } -# If not '', a 'Last updated on:' timestamp is inserted at every page bottom, -# using the given strftime format. -html_last_updated_fmt = '%b %d, %Y' +# This 'Last updated on:' timestamp is inserted at the bottom of every page. +html_last_updated_fmt = time.strftime('%b %d, %Y (%H:%M UTC)', time.gmtime()) # Path to find HTML templates. templates_path = ['tools/templates'] From e2421a36f0868e2c5eb492ca9e74ae3d7c37357e Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 15 Nov 2023 15:20:18 +0100 Subject: [PATCH 200/265] [3.11] gh-111942: Fix crashes in TextIOWrapper.reconfigure() (GH-111976) (GH-112059) * Fix crash when encoding is not string or None. * Fix crash when both line_buffering and write_through raise exception when converted ti int. * Add a number of tests for constructor and reconfigure() method with invalid arguments. 
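With this change applied, the invalid calls raise a normal exception instead of crashing; a minimal sketch (for illustration only, not taken from the patch):

    # Illustrative sketch: invalid arguments now raise, valid ones keep working.
    import io

    txt = io.TextIOWrapper(io.BytesIO(), encoding="ascii", errors="replace")
    try:
        txt.reconfigure(encoding=42)  # non-string encoding
    except TypeError as exc:
        print(exc)
    txt.reconfigure(encoding="latin1", errors="ignore")
    print(txt.encoding, txt.errors)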
(cherry picked from commit ee06fffd38cb51ce1c045da9d8336d9ce13c318a) Co-authored-by: Serhiy Storchaka --- Lib/test/test_io.py | 84 ++++++++++++++++++- ...-11-10-22-08-28.gh-issue-111942.MDFm6v.rst | 2 + ...-11-14-18-43-55.gh-issue-111942.x1pnrj.rst | 2 + Modules/_io/textio.c | 50 ++++++++++- 4 files changed, 132 insertions(+), 6 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst create mode 100644 Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst diff --git a/Lib/test/test_io.py b/Lib/test/test_io.py index 0f1d526b0e7b3b..78537057f32666 100644 --- a/Lib/test/test_io.py +++ b/Lib/test/test_io.py @@ -81,6 +81,10 @@ def _default_chunk_size(): ) +class BadIndex: + def __index__(self): + 1/0 + class MockRawIOWithoutRead: """A RawIO implementation without read(), so as to exercise the default RawIO.read() which calls readinto().""" @@ -2613,8 +2617,29 @@ def test_constructor(self): self.assertEqual(t.encoding, "utf-8") self.assertEqual(t.line_buffering, True) self.assertEqual("\xe9\n", t.readline()) - self.assertRaises(TypeError, t.__init__, b, encoding="utf-8", newline=42) - self.assertRaises(ValueError, t.__init__, b, encoding="utf-8", newline='xyzzy') + invalid_type = TypeError if self.is_C else ValueError + with self.assertRaises(invalid_type): + t.__init__(b, encoding=42) + with self.assertRaises(UnicodeEncodeError): + t.__init__(b, encoding='\udcfe') + with self.assertRaises(ValueError): + t.__init__(b, encoding='utf-8\0') + with self.assertRaises(invalid_type): + t.__init__(b, encoding="utf-8", errors=42) + if support.Py_DEBUG or sys.flags.dev_mode or self.is_C: + with self.assertRaises(UnicodeEncodeError): + t.__init__(b, encoding="utf-8", errors='\udcfe') + if support.Py_DEBUG or sys.flags.dev_mode or self.is_C: + with self.assertRaises(ValueError): + t.__init__(b, encoding="utf-8", errors='replace\0') + with self.assertRaises(TypeError): + t.__init__(b, encoding="utf-8", newline=42) + with self.assertRaises(ValueError): + t.__init__(b, encoding="utf-8", newline='\udcfe') + with self.assertRaises(ValueError): + t.__init__(b, encoding="utf-8", newline='\n\0') + with self.assertRaises(ValueError): + t.__init__(b, encoding="utf-8", newline='xyzzy') def test_uninitialized(self): t = self.TextIOWrapper.__new__(self.TextIOWrapper) @@ -3663,6 +3688,59 @@ def test_reconfigure_defaults(self): self.assertEqual(txt.detach().getvalue(), b'LF\nCRLF\r\n') + def test_reconfigure_errors(self): + txt = self.TextIOWrapper(self.BytesIO(), 'ascii', 'replace', '\r') + with self.assertRaises(TypeError): # there was a crash + txt.reconfigure(encoding=42) + if self.is_C: + with self.assertRaises(UnicodeEncodeError): + txt.reconfigure(encoding='\udcfe') + with self.assertRaises(LookupError): + txt.reconfigure(encoding='locale\0') + # TODO: txt.reconfigure(encoding='utf-8\0') + # TODO: txt.reconfigure(encoding='nonexisting') + with self.assertRaises(TypeError): + txt.reconfigure(errors=42) + if self.is_C: + with self.assertRaises(UnicodeEncodeError): + txt.reconfigure(errors='\udcfe') + # TODO: txt.reconfigure(errors='ignore\0') + # TODO: txt.reconfigure(errors='nonexisting') + with self.assertRaises(TypeError): + txt.reconfigure(newline=42) + with self.assertRaises(ValueError): + txt.reconfigure(newline='\udcfe') + with self.assertRaises(ValueError): + txt.reconfigure(newline='xyz') + if not self.is_C: + # TODO: Should fail in C too. 
+ with self.assertRaises(ValueError): + txt.reconfigure(newline='\n\0') + if self.is_C: + # TODO: Use __bool__(), not __index__(). + with self.assertRaises(ZeroDivisionError): + txt.reconfigure(line_buffering=BadIndex()) + with self.assertRaises(OverflowError): + txt.reconfigure(line_buffering=2**1000) + with self.assertRaises(ZeroDivisionError): + txt.reconfigure(write_through=BadIndex()) + with self.assertRaises(OverflowError): + txt.reconfigure(write_through=2**1000) + with self.assertRaises(ZeroDivisionError): # there was a crash + txt.reconfigure(line_buffering=BadIndex(), + write_through=BadIndex()) + self.assertEqual(txt.encoding, 'ascii') + self.assertEqual(txt.errors, 'replace') + self.assertIs(txt.line_buffering, False) + self.assertIs(txt.write_through, False) + + txt.reconfigure(encoding='latin1', errors='ignore', newline='\r\n', + line_buffering=True, write_through=True) + self.assertEqual(txt.encoding, 'latin1') + self.assertEqual(txt.errors, 'ignore') + self.assertIs(txt.line_buffering, True) + self.assertIs(txt.write_through, True) + def test_reconfigure_newline(self): raw = self.BytesIO(b'CR\rEOF') txt = self.TextIOWrapper(raw, 'ascii', newline='\n') @@ -4693,9 +4771,11 @@ def load_tests(loader, tests, pattern): if test.__name__.startswith("C"): for name, obj in c_io_ns.items(): setattr(test, name, obj) + test.is_C = True elif test.__name__.startswith("Py"): for name, obj in py_io_ns.items(): setattr(test, name, obj) + test.is_C = False suite = loader.suiteClass() for test in tests: diff --git a/Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst b/Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst new file mode 100644 index 00000000000000..4fc505c8f257a6 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst @@ -0,0 +1,2 @@ +Fix crashes in :meth:`io.TextIOWrapper.reconfigure` when pass invalid +arguments, e.g. non-string encoding. diff --git a/Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst b/Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst new file mode 100644 index 00000000000000..ca58a6fa5d6ae1 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst @@ -0,0 +1,2 @@ +Fix SystemError in the TextIOWrapper constructor with non-encodable "errors" +argument in non-debug mode. 
diff --git a/Modules/_io/textio.c b/Modules/_io/textio.c index 7d5afb4c4441d6..b994dc3d7c45b1 100644 --- a/Modules/_io/textio.c +++ b/Modules/_io/textio.c @@ -1099,6 +1099,15 @@ _io_TextIOWrapper___init___impl(textio *self, PyObject *buffer, else if (io_check_errors(errors)) { return -1; } + Py_ssize_t errors_len; + const char *errors_str = PyUnicode_AsUTF8AndSize(errors, &errors_len); + if (errors_str == NULL) { + return -1; + } + if (strlen(errors_str) != (size_t)errors_len) { + PyErr_SetString(PyExc_ValueError, "embedded null character"); + return -1; + } if (validate_newline(newline) < 0) { return -1; @@ -1171,11 +1180,11 @@ _io_TextIOWrapper___init___impl(textio *self, PyObject *buffer, Py_INCREF(buffer); /* Build the decoder object */ - if (_textiowrapper_set_decoder(self, codec_info, PyUnicode_AsUTF8(errors)) != 0) + if (_textiowrapper_set_decoder(self, codec_info, errors_str) != 0) goto error; /* Build the encoder object */ - if (_textiowrapper_set_encoder(self, codec_info, PyUnicode_AsUTF8(errors)) != 0) + if (_textiowrapper_set_encoder(self, codec_info, errors_str) != 0) goto error; /* Finished sorting out the codec details */ @@ -1272,24 +1281,34 @@ textiowrapper_change_encoding(textio *self, PyObject *encoding, errors = &_Py_ID(strict); } } + Py_INCREF(errors); + const char *c_encoding = PyUnicode_AsUTF8(encoding); + if (c_encoding == NULL) { + Py_DECREF(encoding); + Py_DECREF(errors); + return -1; + } const char *c_errors = PyUnicode_AsUTF8(errors); if (c_errors == NULL) { Py_DECREF(encoding); + Py_DECREF(errors); return -1; } // Create new encoder & decoder PyObject *codec_info = _PyCodec_LookupTextEncoding( - PyUnicode_AsUTF8(encoding), "codecs.open()"); + c_encoding, "codecs.open()"); if (codec_info == NULL) { Py_DECREF(encoding); + Py_DECREF(errors); return -1; } if (_textiowrapper_set_decoder(self, codec_info, c_errors) != 0 || _textiowrapper_set_encoder(self, codec_info, c_errors) != 0) { Py_DECREF(codec_info); Py_DECREF(encoding); + Py_DECREF(errors); return -1; } Py_DECREF(codec_info); @@ -1327,6 +1346,26 @@ _io_TextIOWrapper_reconfigure_impl(textio *self, PyObject *encoding, int write_through; const char *newline = NULL; + if (encoding != Py_None && !PyUnicode_Check(encoding)) { + PyErr_Format(PyExc_TypeError, + "reconfigure() argument 'encoding' must be str or None, not %s", + Py_TYPE(encoding)->tp_name); + return NULL; + } + if (errors != Py_None && !PyUnicode_Check(errors)) { + PyErr_Format(PyExc_TypeError, + "reconfigure() argument 'errors' must be str or None, not %s", + Py_TYPE(errors)->tp_name); + return NULL; + } + if (newline_obj != NULL && newline_obj != Py_None && + !PyUnicode_Check(newline_obj)) + { + PyErr_Format(PyExc_TypeError, + "reconfigure() argument 'newline' must be str or None, not %s", + Py_TYPE(newline_obj)->tp_name); + return NULL; + } /* Check if something is in the read buffer */ if (self->decoded_chars != NULL) { if (encoding != Py_None || errors != Py_None || newline_obj != NULL) { @@ -1345,9 +1384,12 @@ _io_TextIOWrapper_reconfigure_impl(textio *self, PyObject *encoding, line_buffering = convert_optional_bool(line_buffering_obj, self->line_buffering); + if (line_buffering < 0) { + return NULL; + } write_through = convert_optional_bool(write_through_obj, self->write_through); - if (line_buffering < 0 || write_through < 0) { + if (write_through < 0) { return NULL; } From cdfef0c38317ca25841a0461a934ebc3a8b81686 Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Thu, 16 Nov 2023 02:39:30 +0200 Subject: [PATCH 201/265] [3.11] gh-111062: 
CI: Move OS test jobs to reusable workflows (gh-111570) CI: Move OS test jobs to reusable workflows Co-authored-by: Donghee Na --- .github/workflows/build.yml | 161 +++---------------------- .github/workflows/reusable-macos.yml | 46 +++++++ .github/workflows/reusable-ubuntu.yml | 71 +++++++++++ .github/workflows/reusable-windows.yml | 53 ++++++++ 4 files changed, 184 insertions(+), 147 deletions(-) create mode 100644 .github/workflows/reusable-macos.yml create mode 100644 .github/workflows/reusable-ubuntu.yml create mode 100644 .github/workflows/reusable-windows.yml diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index 71901d4cab2ae5..ad9ec980a9540a 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -99,7 +99,7 @@ jobs: steps: - uses: actions/checkout@v4 - uses: actions/setup-python@v4 - - name: Install Dependencies + - name: Install dependencies run: | sudo ./.github/workflows/posix-deps-apt.sh sudo apt-get install -yq abigail-tools @@ -115,7 +115,7 @@ jobs: run: | if ! make check-abidump; then echo "Generated ABI file is not up to date." - echo "Please, add the release manager of this branch as a reviewer of this PR." + echo "Please add the release manager of this branch as a reviewer of this PR." echo "" echo "The up to date ABI file should be attached to this build as an artifact." echo "" @@ -194,159 +194,32 @@ jobs: - name: Check limited ABI symbols run: make check-limited-abi - build_win32: - name: 'Windows (x86)' - runs-on: windows-latest - timeout-minutes: 60 - needs: check_source - if: needs.check_source.outputs.run_tests == 'true' - env: - IncludeUwp: 'true' - steps: - - uses: actions/checkout@v4 - - name: Build CPython - run: .\PCbuild\build.bat -e -d -p Win32 - - name: Display build info - run: .\python.bat -m test.pythoninfo - - name: Tests - run: .\PCbuild\rt.bat -p Win32 -d -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 - - build_win_amd64: - name: 'Windows (x64)' - runs-on: windows-latest - timeout-minutes: 60 + build_windows: + name: 'Windows' needs: check_source if: needs.check_source.outputs.run_tests == 'true' - env: - IncludeUwp: 'true' - steps: - - uses: actions/checkout@v4 - - name: Register MSVC problem matcher - run: echo "::add-matcher::.github/problem-matchers/msvc.json" - - name: Build CPython - run: .\PCbuild\build.bat -e -d -p x64 - - name: Display build info - run: .\python.bat -m test.pythoninfo - - name: Tests - run: .\PCbuild\rt.bat -p x64 -d -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 - - build_win_arm64: - name: 'Windows (arm64)' - runs-on: windows-latest - timeout-minutes: 60 - needs: check_source - if: needs.check_source.outputs.run_tests == 'true' - env: - IncludeUwp: 'true' - steps: - - uses: actions/checkout@v4 - - name: Register MSVC problem matcher - run: echo "::add-matcher::.github/problem-matchers/msvc.json" - - name: Build CPython - run: .\PCbuild\build.bat -e -d -p arm64 + uses: ./.github/workflows/reusable-windows.yml build_macos: name: 'macOS' - runs-on: macos-latest - timeout-minutes: 60 needs: check_source if: needs.check_source.outputs.run_tests == 'true' - env: - HOMEBREW_NO_ANALYTICS: 1 - HOMEBREW_NO_AUTO_UPDATE: 1 - HOMEBREW_NO_INSTALL_CLEANUP: 1 - PYTHONSTRICTEXTENSIONBUILD: 1 - steps: - - uses: actions/checkout@v4 - - name: Restore config.cache - uses: actions/cache@v3 - with: - path: config.cache - key: ${{ github.job }}-${{ runner.os }}-${{ needs.check_source.outputs.config_hash }} - - name: Install Homebrew dependencies - run: brew install pkg-config openssl@3.0 xz gdbm 
tcl-tk - - name: Configure CPython - run: | - GDBM_CFLAGS="-I$(brew --prefix gdbm)/include" \ - GDBM_LIBS="-L$(brew --prefix gdbm)/lib -lgdbm" \ - ./configure \ - --config-cache \ - --with-pydebug \ - --prefix=/opt/python-dev \ - --with-openssl="$(brew --prefix openssl@3.0)" - - name: Build CPython - run: make -j4 - - name: Display build info - run: make pythoninfo - - name: Tests - run: make buildbottest TESTOPTS="-j4 -uall,-cpu" + uses: ./.github/workflows/reusable-macos.yml + with: + config_hash: ${{ needs.check_source.outputs.config_hash }} build_ubuntu: name: 'Ubuntu' - runs-on: ubuntu-20.04 - timeout-minutes: 60 needs: check_source if: needs.check_source.outputs.run_tests == 'true' - env: - OPENSSL_VER: 3.0.11 - PYTHONSTRICTEXTENSIONBUILD: 1 - steps: - - uses: actions/checkout@v4 - - name: Register gcc problem matcher - run: echo "::add-matcher::.github/problem-matchers/gcc.json" - - name: Install dependencies - run: sudo ./.github/workflows/posix-deps-apt.sh - - name: Configure OpenSSL env vars - run: | - echo "MULTISSL_DIR=${GITHUB_WORKSPACE}/multissl" >> $GITHUB_ENV - echo "OPENSSL_DIR=${GITHUB_WORKSPACE}/multissl/openssl/${OPENSSL_VER}" >> $GITHUB_ENV - echo "LD_LIBRARY_PATH=${GITHUB_WORKSPACE}/multissl/openssl/${OPENSSL_VER}/lib" >> $GITHUB_ENV - - name: 'Restore OpenSSL build' - id: cache-openssl - uses: actions/cache@v3 - with: - path: ./multissl/openssl/${{ env.OPENSSL_VER }} - key: ${{ runner.os }}-multissl-openssl-${{ env.OPENSSL_VER }} - - name: Install OpenSSL - if: steps.cache-openssl.outputs.cache-hit != 'true' - run: python3 Tools/ssl/multissltests.py --steps=library --base-directory $MULTISSL_DIR --openssl $OPENSSL_VER --system Linux - - name: Add ccache to PATH - run: | - echo "PATH=/usr/lib/ccache:$PATH" >> $GITHUB_ENV - - name: Configure ccache action - uses: hendrikmuhs/ccache-action@v1.2 - - name: Setup directory envs for out-of-tree builds - run: | - echo "CPYTHON_RO_SRCDIR=$(realpath -m ${GITHUB_WORKSPACE}/../cpython-ro-srcdir)" >> $GITHUB_ENV - echo "CPYTHON_BUILDDIR=$(realpath -m ${GITHUB_WORKSPACE}/../cpython-builddir)" >> $GITHUB_ENV - - name: Create directories for read-only out-of-tree builds - run: mkdir -p $CPYTHON_RO_SRCDIR $CPYTHON_BUILDDIR - - name: Bind mount sources read-only - run: sudo mount --bind -o ro $GITHUB_WORKSPACE $CPYTHON_RO_SRCDIR - - name: Restore config.cache - uses: actions/cache@v3 - with: - path: ${{ env.CPYTHON_BUILDDIR }}/config.cache - key: ${{ github.job }}-${{ runner.os }}-${{ needs.check_source.outputs.config_hash }} - - name: Configure CPython out-of-tree - working-directory: ${{ env.CPYTHON_BUILDDIR }} - run: | + uses: ./.github/workflows/reusable-ubuntu.yml + with: + config_hash: ${{ needs.check_source.outputs.config_hash }} + options: | ../cpython-ro-srcdir/configure \ --config-cache \ --with-pydebug \ --with-openssl=$OPENSSL_DIR - - name: Build CPython out-of-tree - working-directory: ${{ env.CPYTHON_BUILDDIR }} - run: make -j4 - - name: Display build info - working-directory: ${{ env.CPYTHON_BUILDDIR }} - run: make pythoninfo - - name: Remount sources writable for tests - # some tests write to srcdir, lack of pyc files slows down testing - run: sudo mount $CPYTHON_RO_SRCDIR -oremount,rw - - name: Tests - working-directory: ${{ env.CPYTHON_BUILDDIR }} - run: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu" build_ubuntu_ssltests: name: 'Ubuntu SSL tests with OpenSSL' @@ -463,12 +336,10 @@ jobs: - check_source # Transitive dependency, needed to access `run_tests` value - check-docs - check_generated_files - - 
build_win32 - - build_win_amd64 - - build_win_arm64 - build_macos - build_ubuntu - build_ubuntu_ssltests + - build_windows - build_asan runs-on: ubuntu-latest @@ -480,8 +351,6 @@ jobs: allowed-failures: >- build_macos, build_ubuntu_ssltests, - build_win32, - build_win_arm64, allowed-skips: >- ${{ !fromJSON(needs.check_source.outputs.run-docs) @@ -494,12 +363,10 @@ jobs: needs.check_source.outputs.run_tests != 'true' && ' check_generated_files, - build_win32, - build_win_amd64, - build_win_arm64, build_macos, build_ubuntu, build_ubuntu_ssltests, + build_windows, build_asan, ' || '' diff --git a/.github/workflows/reusable-macos.yml b/.github/workflows/reusable-macos.yml new file mode 100644 index 00000000000000..3ed8303ac16d97 --- /dev/null +++ b/.github/workflows/reusable-macos.yml @@ -0,0 +1,46 @@ +on: + workflow_call: + inputs: + config_hash: + required: true + type: string + free-threaded: + required: false + type: boolean + default: false + +jobs: + build_macos: + name: 'build and test' + runs-on: macos-latest + timeout-minutes: 60 + env: + HOMEBREW_NO_ANALYTICS: 1 + HOMEBREW_NO_AUTO_UPDATE: 1 + HOMEBREW_NO_INSTALL_CLEANUP: 1 + PYTHONSTRICTEXTENSIONBUILD: 1 + steps: + - uses: actions/checkout@v4 + - name: Restore config.cache + uses: actions/cache@v3 + with: + path: config.cache + key: ${{ github.job }}-${{ runner.os }}-${{ inputs.config_hash }} + - name: Install Homebrew dependencies + run: brew install pkg-config openssl@3.0 xz gdbm tcl-tk + - name: Configure CPython + run: | + GDBM_CFLAGS="-I$(brew --prefix gdbm)/include" \ + GDBM_LIBS="-L$(brew --prefix gdbm)/lib -lgdbm" \ + ./configure \ + --config-cache \ + --with-pydebug \ + ${{ inputs.free-threaded && '--disable-gil' || '' }} \ + --prefix=/opt/python-dev \ + --with-openssl="$(brew --prefix openssl@3.0)" + - name: Build CPython + run: make -j4 + - name: Display build info + run: make pythoninfo + - name: Tests + run: make buildbottest TESTOPTS="-j4 -uall,-cpu" diff --git a/.github/workflows/reusable-ubuntu.yml b/.github/workflows/reusable-ubuntu.yml new file mode 100644 index 00000000000000..56268c8bfd3419 --- /dev/null +++ b/.github/workflows/reusable-ubuntu.yml @@ -0,0 +1,71 @@ +on: + workflow_call: + inputs: + config_hash: + required: true + type: string + options: + required: true + type: string + +jobs: + build_ubuntu_reusable: + name: 'build and test' + timeout-minutes: 60 + runs-on: ubuntu-20.04 + env: + OPENSSL_VER: 3.0.11 + PYTHONSTRICTEXTENSIONBUILD: 1 + steps: + - uses: actions/checkout@v4 + - name: Register gcc problem matcher + run: echo "::add-matcher::.github/problem-matchers/gcc.json" + - name: Install dependencies + run: sudo ./.github/workflows/posix-deps-apt.sh + - name: Configure OpenSSL env vars + run: | + echo "MULTISSL_DIR=${GITHUB_WORKSPACE}/multissl" >> $GITHUB_ENV + echo "OPENSSL_DIR=${GITHUB_WORKSPACE}/multissl/openssl/${OPENSSL_VER}" >> $GITHUB_ENV + echo "LD_LIBRARY_PATH=${GITHUB_WORKSPACE}/multissl/openssl/${OPENSSL_VER}/lib" >> $GITHUB_ENV + - name: 'Restore OpenSSL build' + id: cache-openssl + uses: actions/cache@v3 + with: + path: ./multissl/openssl/${{ env.OPENSSL_VER }} + key: ${{ runner.os }}-multissl-openssl-${{ env.OPENSSL_VER }} + - name: Install OpenSSL + if: steps.cache-openssl.outputs.cache-hit != 'true' + run: python3 Tools/ssl/multissltests.py --steps=library --base-directory $MULTISSL_DIR --openssl $OPENSSL_VER --system Linux + - name: Add ccache to PATH + run: | + echo "PATH=/usr/lib/ccache:$PATH" >> $GITHUB_ENV + - name: Configure ccache action + uses: hendrikmuhs/ccache-action@v1.2 
+ - name: Setup directory envs for out-of-tree builds + run: | + echo "CPYTHON_RO_SRCDIR=$(realpath -m ${GITHUB_WORKSPACE}/../cpython-ro-srcdir)" >> $GITHUB_ENV + echo "CPYTHON_BUILDDIR=$(realpath -m ${GITHUB_WORKSPACE}/../cpython-builddir)" >> $GITHUB_ENV + - name: Create directories for read-only out-of-tree builds + run: mkdir -p $CPYTHON_RO_SRCDIR $CPYTHON_BUILDDIR + - name: Bind mount sources read-only + run: sudo mount --bind -o ro $GITHUB_WORKSPACE $CPYTHON_RO_SRCDIR + - name: Restore config.cache + uses: actions/cache@v3 + with: + path: ${{ env.CPYTHON_BUILDDIR }}/config.cache + key: ${{ github.job }}-${{ runner.os }}-${{ inputs.config_hash }} + - name: Configure CPython out-of-tree + working-directory: ${{ env.CPYTHON_BUILDDIR }} + run: ${{ inputs.options }} + - name: Build CPython out-of-tree + working-directory: ${{ env.CPYTHON_BUILDDIR }} + run: make -j4 + - name: Display build info + working-directory: ${{ env.CPYTHON_BUILDDIR }} + run: make pythoninfo + - name: Remount sources writable for tests + # some tests write to srcdir, lack of pyc files slows down testing + run: sudo mount $CPYTHON_RO_SRCDIR -oremount,rw + - name: Tests + working-directory: ${{ env.CPYTHON_BUILDDIR }} + run: xvfb-run make buildbottest TESTOPTS="-j4 -uall,-cpu" diff --git a/.github/workflows/reusable-windows.yml b/.github/workflows/reusable-windows.yml new file mode 100644 index 00000000000000..98fb7cfa015587 --- /dev/null +++ b/.github/workflows/reusable-windows.yml @@ -0,0 +1,53 @@ +on: + workflow_call: + inputs: + free-threaded: + required: false + type: boolean + default: false + +jobs: + build_win32: + name: 'build and test (x86)' + runs-on: windows-latest + timeout-minutes: 60 + env: + IncludeUwp: 'true' + steps: + - uses: actions/checkout@v4 + - name: Build CPython + run: .\PCbuild\build.bat -e -d -p Win32 ${{ inputs.free-threaded && '--disable-gil' || '' }} + - name: Display build info + run: .\python.bat -m test.pythoninfo + - name: Tests + run: .\PCbuild\rt.bat -p Win32 -d -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 + + build_win_amd64: + name: 'build and test (x64)' + runs-on: windows-latest + timeout-minutes: 60 + env: + IncludeUwp: 'true' + steps: + - uses: actions/checkout@v4 + - name: Register MSVC problem matcher + run: echo "::add-matcher::.github/problem-matchers/msvc.json" + - name: Build CPython + run: .\PCbuild\build.bat -e -d -p x64 ${{ inputs.free-threaded && '--disable-gil' || '' }} + - name: Display build info + run: .\python.bat -m test.pythoninfo + - name: Tests + run: .\PCbuild\rt.bat -p x64 -d -q -uall -u-cpu -rwW --slowest --timeout=1200 -j0 + + build_win_arm64: + name: 'build (arm64)' + runs-on: windows-latest + timeout-minutes: 60 + env: + IncludeUwp: 'true' + steps: + - uses: actions/checkout@v4 + - name: Register MSVC problem matcher + run: echo "::add-matcher::.github/problem-matchers/msvc.json" + - name: Build CPython + run: .\PCbuild\build.bat -e -d -p arm64 ${{ inputs.free-threaded && '--disable-gil' || '' }} From 132a74922a786aec6c9dc35e6756736c4ebcb966 Mon Sep 17 00:00:00 2001 From: Petr Viktorin Date: Thu, 16 Nov 2023 13:07:53 +0100 Subject: [PATCH 202/265] [3.11] gh-102837: more tests for the math module (GH-111930)(GH-102523) (GH-112030) (GH-112041) [3.12] gh-102837: more tests for the math module (GH-111930)(GH-102523) (GH-112030) * gh-102837: improve test coverage for math module (GH-102523) (Only the test changes from GH-102523 are cherry-picked) - input checks for math_1(L989), math_1a(L1023), math_2(L1064,L1071), hypot(L2682), log(L2307), 
ldexp(L2168), ceil(L1165), floor(L1236,L1239) and dist(L2587,L2588,L2628). - improve fsum coverage for exceptional cases (L1433,L1438,L1451,L1497), ditto fmod(L2378) (all line numbers are wrt the main branch at 5e6661bce9) * gh-102837: more tests for the math module (GH-111930) Add tests to improve coverage: * fsum: L1369, L1379, L1383, L1412 * trunc: L2081 * log: L2267 * dist: L2577, L2579 * hypot: L2632 * (not cherry-picked for 3.11: sumprod) * pow: L2982 * prod: L3294, L3308, L3318-3330 // line numbers wrt to 9dc4fb8204 (cherry picked from commit c61de456db0186b65d479d41e84127832205d30d) --------- Co-authored-by: Sergey B Kirpichev (cherry picked from commit c6aea46a71d158f993cc723c14b4bf7982b73a2a) --- Lib/test/test_math.py | 80 +++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 80 insertions(+) diff --git a/Lib/test/test_math.py b/Lib/test/test_math.py index bf0d0a56e6ac8b..7edcce9bbde23b 100644 --- a/Lib/test/test_math.py +++ b/Lib/test/test_math.py @@ -234,6 +234,10 @@ def __init__(self, value): def __index__(self): return self.value +class BadDescr: + def __get__(self, obj, objtype=None): + raise ValueError + class MathTests(unittest.TestCase): def ftest(self, name, got, expected, ulp_tol=5, abs_tol=0.0): @@ -323,6 +327,7 @@ def testAtan2(self): self.ftest('atan2(0, 1)', math.atan2(0, 1), 0) self.ftest('atan2(1, 1)', math.atan2(1, 1), math.pi/4) self.ftest('atan2(1, 0)', math.atan2(1, 0), math.pi/2) + self.ftest('atan2(1, -1)', math.atan2(1, -1), 3*math.pi/4) # math.atan2(0, x) self.ftest('atan2(0., -inf)', math.atan2(0., NINF), math.pi) @@ -416,16 +421,22 @@ def __ceil__(self): return 42 class TestNoCeil: pass + class TestBadCeil: + __ceil__ = BadDescr() self.assertEqual(math.ceil(TestCeil()), 42) self.assertEqual(math.ceil(FloatCeil()), 42) self.assertEqual(math.ceil(FloatLike(42.5)), 43) self.assertRaises(TypeError, math.ceil, TestNoCeil()) + self.assertRaises(ValueError, math.ceil, TestBadCeil()) t = TestNoCeil() t.__ceil__ = lambda *args: args self.assertRaises(TypeError, math.ceil, t) self.assertRaises(TypeError, math.ceil, t, 0) + self.assertEqual(math.ceil(FloatLike(+1.0)), +1.0) + self.assertEqual(math.ceil(FloatLike(-1.0)), -1.0) + @requires_IEEE_754 def testCopysign(self): self.assertEqual(math.copysign(1, 42), 1.0) @@ -566,16 +577,22 @@ def __floor__(self): return 42 class TestNoFloor: pass + class TestBadFloor: + __floor__ = BadDescr() self.assertEqual(math.floor(TestFloor()), 42) self.assertEqual(math.floor(FloatFloor()), 42) self.assertEqual(math.floor(FloatLike(41.9)), 41) self.assertRaises(TypeError, math.floor, TestNoFloor()) + self.assertRaises(ValueError, math.floor, TestBadFloor()) t = TestNoFloor() t.__floor__ = lambda *args: args self.assertRaises(TypeError, math.floor, t) self.assertRaises(TypeError, math.floor, t, 0) + self.assertEqual(math.floor(FloatLike(+1.0)), +1.0) + self.assertEqual(math.floor(FloatLike(-1.0)), -1.0) + def testFmod(self): self.assertRaises(TypeError, math.fmod) self.ftest('fmod(10, 1)', math.fmod(10, 1), 0.0) @@ -597,6 +614,7 @@ def testFmod(self): self.assertEqual(math.fmod(-3.0, NINF), -3.0) self.assertEqual(math.fmod(0.0, 3.0), 0.0) self.assertEqual(math.fmod(0.0, NINF), 0.0) + self.assertRaises(ValueError, math.fmod, INF, INF) def testFrexp(self): self.assertRaises(TypeError, math.frexp) @@ -666,6 +684,7 @@ def msum(iterable): ([], 0.0), ([0.0], 0.0), ([1e100, 1.0, -1e100, 1e-100, 1e50, -1.0, -1e50], 1e-100), + ([1e100, 1.0, -1e100, 1e-100, 1e50, -1, -1e50], 1e-100), ([2.0**53, -0.5, -2.0**-54], 2.0**53-1.0), ([2.0**53, 
1.0, 2.0**-100], 2.0**53+2.0), ([2.0**53+10.0, 1.0, 2.0**-100], 2.0**53+12.0), @@ -713,6 +732,22 @@ def msum(iterable): s = msum(vals) self.assertEqual(msum(vals), math.fsum(vals)) + self.assertEqual(math.fsum([1.0, math.inf]), math.inf) + self.assertTrue(math.isnan(math.fsum([math.nan, 1.0]))) + self.assertEqual(math.fsum([1e100, FloatLike(1.0), -1e100, 1e-100, + 1e50, FloatLike(-1.0), -1e50]), 1e-100) + self.assertRaises(OverflowError, math.fsum, [1e+308, 1e+308]) + self.assertRaises(ValueError, math.fsum, [math.inf, -math.inf]) + self.assertRaises(TypeError, math.fsum, ['spam']) + self.assertRaises(TypeError, math.fsum, 1) + self.assertRaises(OverflowError, math.fsum, [10**1000]) + + def bad_iter(): + yield 1.0 + raise ZeroDivisionError + + self.assertRaises(ZeroDivisionError, math.fsum, bad_iter()) + def testGcd(self): gcd = math.gcd self.assertEqual(gcd(0, 0), 0) @@ -773,6 +808,8 @@ def testHypot(self): # Test allowable types (those with __float__) self.assertEqual(hypot(12.0, 5.0), 13.0) self.assertEqual(hypot(12, 5), 13) + self.assertEqual(hypot(1, -1), math.sqrt(2)) + self.assertEqual(hypot(1, FloatLike(-1.)), math.sqrt(2)) self.assertEqual(hypot(Decimal(12), Decimal(5)), 13) self.assertEqual(hypot(Fraction(12, 32), Fraction(5, 32)), Fraction(13, 32)) self.assertEqual(hypot(bool(1), bool(0), bool(1), bool(1)), math.sqrt(3)) @@ -830,6 +867,8 @@ def testHypot(self): scale = FLOAT_MIN / 2.0 ** exp self.assertEqual(math.hypot(4*scale, 3*scale), 5*scale) + self.assertRaises(TypeError, math.hypot, *([1.0]*18), 'spam') + @requires_IEEE_754 @unittest.skipIf(HAVE_DOUBLE_ROUNDING, "hypot() loses accuracy on machines with double rounding") @@ -922,6 +961,10 @@ def testDist(self): # Test allowable types (those with __float__) self.assertEqual(dist((14.0, 1.0), (2.0, -4.0)), 13.0) self.assertEqual(dist((14, 1), (2, -4)), 13) + self.assertEqual(dist((FloatLike(14.), 1), (2, -4)), 13) + self.assertEqual(dist((11, 1), (FloatLike(-1.), -4)), 13) + self.assertEqual(dist((14, FloatLike(-1.)), (2, -6)), 13) + self.assertEqual(dist((14, -1), (2, -6)), 13) self.assertEqual(dist((D(14), D(1)), (D(2), D(-4))), D(13)) self.assertEqual(dist((F(14, 32), F(1, 32)), (F(2, 32), F(-4, 32))), F(13, 32)) @@ -965,6 +1008,8 @@ class T(tuple): dist((1, 2, 3, 4), (5, 6, 7)) with self.assertRaises(ValueError): # Check dimension agree dist((1, 2, 3), (4, 5, 6, 7)) + with self.assertRaises(TypeError): + dist((1,)*17 + ("spam",), (1,)*18) with self.assertRaises(TypeError): # Rejects invalid types dist("abc", "xyz") int_too_big_for_float = 10 ** (sys.float_info.max_10_exp + 5) @@ -972,6 +1017,16 @@ class T(tuple): dist((1, int_too_big_for_float), (2, 3)) with self.assertRaises((ValueError, OverflowError)): dist((2, 3), (1, int_too_big_for_float)) + with self.assertRaises(TypeError): + dist((1,), 2) + with self.assertRaises(TypeError): + dist([1], 2) + + class BadFloat: + __float__ = BadDescr() + + with self.assertRaises(ValueError): + dist([1], [BadFloat()]) # Verify that the one dimensional case is equivalent to abs() for i in range(20): @@ -1110,6 +1165,7 @@ def test_lcm(self): def testLdexp(self): self.assertRaises(TypeError, math.ldexp) + self.assertRaises(TypeError, math.ldexp, 2.0, 1.1) self.ftest('ldexp(0,1)', math.ldexp(0,1), 0) self.ftest('ldexp(1,1)', math.ldexp(1,1), 2) self.ftest('ldexp(1,-1)', math.ldexp(1,-1), 0.5) @@ -1142,6 +1198,7 @@ def testLdexp(self): def testLog(self): self.assertRaises(TypeError, math.log) + self.assertRaises(TypeError, math.log, 1, 2, 3) self.ftest('log(1/e)', math.log(1/math.e), -1) 
self.ftest('log(1)', math.log(1), 0) self.ftest('log(e)', math.log(math.e), 1) @@ -1152,6 +1209,7 @@ def testLog(self): 2302.5850929940457) self.assertRaises(ValueError, math.log, -1.5) self.assertRaises(ValueError, math.log, -10**1000) + self.assertRaises(ValueError, math.log, 10, -10) self.assertRaises(ValueError, math.log, NINF) self.assertEqual(math.log(INF), INF) self.assertTrue(math.isnan(math.log(NAN))) @@ -1235,6 +1293,7 @@ def testPow(self): self.assertTrue(math.isnan(math.pow(2, NAN))) self.assertTrue(math.isnan(math.pow(0, NAN))) self.assertEqual(math.pow(1, NAN), 1) + self.assertRaises(OverflowError, math.pow, 1e+100, 1e+100) # pow(0., x) self.assertEqual(math.pow(0., INF), 0.) @@ -1591,6 +1650,8 @@ def __trunc__(self): return 23 class TestNoTrunc: pass + class TestBadTrunc: + __trunc__ = BadDescr() self.assertEqual(math.trunc(TestTrunc()), 23) self.assertEqual(math.trunc(FloatTrunc()), 23) @@ -1599,6 +1660,7 @@ class TestNoTrunc: self.assertRaises(TypeError, math.trunc, 1, 2) self.assertRaises(TypeError, math.trunc, FloatLike(23.5)) self.assertRaises(TypeError, math.trunc, TestNoTrunc()) + self.assertRaises(ValueError, math.trunc, TestBadTrunc()) def testIsfinite(self): self.assertTrue(math.isfinite(0.0)) @@ -1799,6 +1861,8 @@ def test_mtestfile(self): '\n '.join(failures)) def test_prod(self): + from fractions import Fraction as F + prod = math.prod self.assertEqual(prod([]), 1) self.assertEqual(prod([], start=5), 5) @@ -1810,6 +1874,14 @@ def test_prod(self): self.assertEqual(prod([1.0, 2.0, 3.0, 4.0, 5.0]), 120.0) self.assertEqual(prod([1, 2, 3, 4.0, 5.0]), 120.0) self.assertEqual(prod([1.0, 2.0, 3.0, 4, 5]), 120.0) + self.assertEqual(prod([1., F(3, 2)]), 1.5) + + # Error in multiplication + class BadMultiply: + def __rmul__(self, other): + raise RuntimeError + with self.assertRaises(RuntimeError): + prod([10., BadMultiply()]) # Test overflow in fast-path for integers self.assertEqual(prod([1, 1, 2**32, 1, 1]), 2**32) @@ -2109,6 +2181,14 @@ def __float__(self): # argument to a float. self.assertFalse(getattr(y, "converted", False)) + def test_input_exceptions(self): + self.assertRaises(TypeError, math.exp, "spam") + self.assertRaises(TypeError, math.erf, "spam") + self.assertRaises(TypeError, math.atan2, "spam", 1.0) + self.assertRaises(TypeError, math.atan2, 1.0, "spam") + self.assertRaises(TypeError, math.atan2, 1.0) + self.assertRaises(TypeError, math.atan2, 1.0, 2.0, 3.0) + # Custom assertions. def assertIsNaN(self, value): From b4f3a62383dfad182cd848acef4a530ecd4f9132 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 16 Nov 2023 13:34:21 +0100 Subject: [PATCH 203/265] [3.11] gh-110812: Isolating Extension Modules HOWTO: List GC-related gotchas (GH-111504) (GH-112147) gh-110812: Isolating Extension Modules HOWTO: List GC-related gotchas (GH-111504) (cherry picked from commit 985679f05d1b72965bfbed99d1499c22815375e4) Co-authored-by: Petr Viktorin Co-authored-by: Hugo van Kemenade Co-authored-by: Eric Snow --- Doc/howto/isolating-extensions.rst | 103 +++++++++++++++++++++++++++-- 1 file changed, 97 insertions(+), 6 deletions(-) diff --git a/Doc/howto/isolating-extensions.rst b/Doc/howto/isolating-extensions.rst index 732fd6f993b722..0d4caf3049ec72 100644 --- a/Doc/howto/isolating-extensions.rst +++ b/Doc/howto/isolating-extensions.rst @@ -337,12 +337,44 @@ That is, heap types should: - Define a traverse function using ``Py_tp_traverse``, which visits the type (e.g. using :c:expr:`Py_VISIT(Py_TYPE(self))`). 
-Please refer to the :ref:`the documentation ` of +Please refer to the the documentation of :c:macro:`Py_TPFLAGS_HAVE_GC` and :c:member:`~PyTypeObject.tp_traverse` for additional considerations. -If your traverse function delegates to the ``tp_traverse`` of its base class -(or another type), ensure that ``Py_TYPE(self)`` is visited only once. +The API for defining heap types grew organically, leaving it +somewhat awkward to use in its current state. +The following sections will guide you through common issues. + + +``tp_traverse`` in Python 3.8 and lower +....................................... + +The requirement to visit the type from ``tp_traverse`` was added in Python 3.9. +If you support Python 3.8 and lower, the traverse function must *not* +visit the type, so it must be more complicated:: + + static int my_traverse(PyObject *self, visitproc visit, void *arg) + { + if (Py_Version >= 0x03090000) { + Py_VISIT(Py_TYPE(self)); + } + return 0; + } + +Unfortunately, :c:data:`Py_Version` was only added in Python 3.11. +As a replacement, use: + +* :c:macro:`PY_VERSION_HEX`, if not using the stable ABI, or +* :py:data:`sys.version_info` (via :c:func:`PySys_GetObject` and + :c:func:`PyArg_ParseTuple`). + + +Delegating ``tp_traverse`` +.......................... + +If your traverse function delegates to the :c:member:`~PyTypeObject.tp_traverse` +of its base class (or another type), ensure that ``Py_TYPE(self)`` is visited +only once. Note that only heap type are expected to visit the type in ``tp_traverse``. For example, if your traverse function includes:: @@ -354,11 +386,70 @@ For example, if your traverse function includes:: if (base->tp_flags & Py_TPFLAGS_HEAPTYPE) { // a heap type's tp_traverse already visited Py_TYPE(self) } else { - Py_VISIT(Py_TYPE(self)); + if (Py_Version >= 0x03090000) { + Py_VISIT(Py_TYPE(self)); + } } -It is not necessary to handle the type's reference count in ``tp_new`` -and ``tp_clear``. +It is not necessary to handle the type's reference count in +:c:member:`~PyTypeObject.tp_new` and :c:member:`~PyTypeObject.tp_clear`. + + +Defining ``tp_dealloc`` +....................... + +If your type has a custom :c:member:`~PyTypeObject.tp_dealloc` function, +it needs to: + +- call :c:func:`PyObject_GC_UnTrack` before any fields are invalidated, and +- decrement the reference count of the type. + +To keep the type valid while ``tp_free`` is called, the type's refcount needs +to be decremented *after* the instance is deallocated. For example:: + + static void my_dealloc(PyObject *self) + { + PyObject_GC_UnTrack(self); + ... + PyTypeObject *type = Py_TYPE(self); + type->tp_free(self); + Py_DECREF(type); + } + +The default ``tp_dealloc`` function does this, so +if your type does *not* override +``tp_dealloc`` you don't need to add it. + + +Not overriding ``tp_free`` +.......................... + +The :c:member:`~PyTypeObject.tp_free` slot of a heap type must be set to +:c:func:`PyObject_GC_Del`. +This is the default; do not override it. + + +Avoiding ``PyObject_New`` +......................... + +GC-tracked objects need to be allocated using GC-aware functions. + +If you use use :c:func:`PyObject_New` or :c:func:`PyObject_NewVar`: + +- Get and call type's :c:member:`~PyTypeObject.tp_alloc` slot, if possible. + That is, replace ``TYPE *o = PyObject_New(TYPE, typeobj)`` with:: + + TYPE *o = typeobj->tp_alloc(typeobj, 0); + + Replace ``o = PyObject_NewVar(TYPE, typeobj, size)`` with the same, + but use size instead of the 0. + +- If the above is not possible (e.g. 
inside a custom ``tp_alloc``), + call :c:func:`PyObject_GC_New` or :c:func:`PyObject_GC_NewVar`:: + + TYPE *o = PyObject_GC_New(TYPE, typeobj); + + TYPE *o = PyObject_GC_NewVar(TYPE, typeobj, size); Module State Access from Classes From 533da5ed67249305e26e97aa0b8d32f8045cde26 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 16 Nov 2023 18:10:02 +0100 Subject: [PATCH 204/265] [3.11] gh-111811: Fix test_recursive_repr for WASI (GH-112130) (#112132) gh-111811: Fix test_recursive_repr for WASI (GH-112130) (cherry picked from commit 7218bac8c84115a8e9a18a4a8f3146235068facb) Co-authored-by: Kushal Das --- Lib/test/test_xml_etree.py | 1 + 1 file changed, 1 insertion(+) diff --git a/Lib/test/test_xml_etree.py b/Lib/test/test_xml_etree.py index c095dd0b59300c..57f5de34fdc108 100644 --- a/Lib/test/test_xml_etree.py +++ b/Lib/test/test_xml_etree.py @@ -2564,6 +2564,7 @@ def __eq__(self, o): e.extend([ET.Element('bar')]) self.assertRaises(ValueError, e.remove, X('baz')) + @support.infinite_recursion(25) def test_recursive_repr(self): # Issue #25455 e = ET.Element('foo') From 7e4b66b86f7bc43760b4e413bb468b7f87168f0c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 17 Nov 2023 02:11:39 +0100 Subject: [PATCH 205/265] [3.11] gh-112165: Fix typo in `__main__.py` (GH-112183) (#112185) gh-112165: Fix typo in `__main__.py` (GH-112183) Change '[2]' to '[1]' to get second argument. (cherry picked from commit 8cd70eefc7f3363cfa0d43f34522c3072fa9e160) Co-authored-by: Terry Jan Reedy --- Doc/library/__main__.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/__main__.rst b/Doc/library/__main__.rst index 435310d525d754..280af06386addf 100644 --- a/Doc/library/__main__.rst +++ b/Doc/library/__main__.rst @@ -227,7 +227,7 @@ students:: import sys from .student import search_students - student_name = sys.argv[2] if len(sys.argv) >= 2 else '' + student_name = sys.argv[1] if len(sys.argv) >= 2 else '' print(f'Found student: {search_students(student_name)}') Note that ``from .student import search_students`` is an example of a relative From e7aa40a3414b1834fb7aa891d76a81398a0ee828 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 17 Nov 2023 19:14:20 +0100 Subject: [PATCH 206/265] [3.11] gh-112194: Convert more examples to doctests in `typing.py` (GH-112195) (#112209) gh-112194: Convert more examples to doctests in `typing.py` (GH-112195) (cherry picked from commit 949b2cc6eae6ef4f3312dfd4e2650a138446fe77) Co-authored-by: Nikita Sobolev Co-authored-by: Alex Waygood --- Lib/typing.py | 74 ++++++++++++++++++++++++++++++--------------------- 1 file changed, 43 insertions(+), 31 deletions(-) diff --git a/Lib/typing.py b/Lib/typing.py index 977e37d623bcd2..fa812f9bd2355a 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -211,8 +211,12 @@ def _should_unflatten_callable_args(typ, args): For example:: - assert collections.abc.Callable[[int, int], str].__args__ == (int, int, str) - assert collections.abc.Callable[ParamSpec, str].__args__ == (ParamSpec, str) + >>> import collections.abc + >>> P = ParamSpec('P') + >>> collections.abc.Callable[[int, int], str].__args__ == (int, int, str) + True + >>> collections.abc.Callable[P, str].__args__ == (P, str) + True As a result, if we need to reconstruct the Callable from its __args__, we need to unflatten it. 
@@ -250,7 +254,10 @@ def _collect_parameters(args): For example:: - assert _collect_parameters((T, Callable[P, T])) == (T, P) + >>> P = ParamSpec('P') + >>> T = TypeVar('T') + >>> _collect_parameters((T, Callable[P, T])) + (~T, ~P) """ parameters = [] for t in args: @@ -2417,14 +2424,15 @@ def get_origin(tp): Examples:: - assert get_origin(Literal[42]) is Literal - assert get_origin(int) is None - assert get_origin(ClassVar[int]) is ClassVar - assert get_origin(Generic) is Generic - assert get_origin(Generic[T]) is Generic - assert get_origin(Union[T, int]) is Union - assert get_origin(List[Tuple[T, T]][int]) is list - assert get_origin(P.args) is P + >>> P = ParamSpec('P') + >>> assert get_origin(Literal[42]) is Literal + >>> assert get_origin(int) is None + >>> assert get_origin(ClassVar[int]) is ClassVar + >>> assert get_origin(Generic) is Generic + >>> assert get_origin(Generic[T]) is Generic + >>> assert get_origin(Union[T, int]) is Union + >>> assert get_origin(List[Tuple[T, T]][int]) is list + >>> assert get_origin(P.args) is P """ if isinstance(tp, _AnnotatedAlias): return Annotated @@ -2445,11 +2453,12 @@ def get_args(tp): Examples:: - assert get_args(Dict[str, int]) == (str, int) - assert get_args(int) == () - assert get_args(Union[int, Union[T, int], str][int]) == (int, str) - assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int]) - assert get_args(Callable[[], T][int]) == ([], int) + >>> T = TypeVar('T') + >>> assert get_args(Dict[str, int]) == (str, int) + >>> assert get_args(int) == () + >>> assert get_args(Union[int, Union[T, int], str][int]) == (int, str) + >>> assert get_args(Union[int, Tuple[T, int]][str]) == (int, Tuple[str, int]) + >>> assert get_args(Callable[[], T][int]) == ([], int) """ if isinstance(tp, _AnnotatedAlias): return (tp.__origin__,) + tp.__metadata__ @@ -2468,12 +2477,15 @@ def is_typeddict(tp): For example:: - class Film(TypedDict): - title: str - year: int - - is_typeddict(Film) # => True - is_typeddict(Union[list, str]) # => False + >>> from typing import TypedDict + >>> class Film(TypedDict): + ... title: str + ... year: int + ... + >>> is_typeddict(Film) + True + >>> is_typeddict(dict) + False """ return isinstance(tp, _TypedDictMeta) @@ -3022,15 +3034,15 @@ def TypedDict(typename, fields=None, /, *, total=True, **kwargs): Usage:: - class Point2D(TypedDict): - x: int - y: int - label: str - - a: Point2D = {'x': 1, 'y': 2, 'label': 'good'} # OK - b: Point2D = {'z': 3, 'label': 'bad'} # Fails type check - - assert Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first') + >>> class Point2D(TypedDict): + ... x: int + ... y: int + ... label: str + ... + >>> a: Point2D = {'x': 1, 'y': 2, 'label': 'good'} # OK + >>> b: Point2D = {'z': 3, 'label': 'bad'} # Fails type check + >>> Point2D(x=1, y=2, label='first') == dict(x=1, y=2, label='first') + True The type info can be accessed via the Point2D.__annotations__ dict, and the Point2D.__required_keys__ and Point2D.__optional_keys__ frozensets. 
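
The doctest-style examples introduced above can also be exercised as an ordinary
script. The following minimal sketch is not part of the patch itself; it assumes
Python 3.11's ``typing`` module, and the ``T``, ``P`` and ``Film`` names are
defined here purely for illustration::

    from typing import (
        Callable, Dict, List, Literal, ParamSpec, Tuple, TypeVar,
        TypedDict, Union, get_args, get_origin, is_typeddict,
    )

    T = TypeVar('T')
    P = ParamSpec('P')

    # get_origin() returns the unsubscripted "origin" of a generic alias.
    assert get_origin(Literal[42]) is Literal
    assert get_origin(Union[T, int]) is Union
    assert get_origin(List[Tuple[T, T]][int]) is list
    assert get_origin(P.args) is P

    # get_args() returns the arguments the alias was subscripted with.
    assert get_args(Dict[str, int]) == (str, int)
    assert get_args(Callable[[], T][int]) == ([], int)

    # is_typeddict() recognises TypedDict classes, but not plain dict.
    class Film(TypedDict):
        title: str
        year: int

    assert is_typeddict(Film)
    assert not is_typeddict(dict)

Each assertion mirrors one of the converted doctests, so a script like this
doubles as a quick smoke test whenever those examples are edited.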
From 93fbcd644318b83619d069301851419b9238d5b6 Mon Sep 17 00:00:00 2001 From: Nikita Sobolev Date: Sat, 18 Nov 2023 14:13:55 +0300 Subject: [PATCH 207/265] [3.11] gh-112155: Run `typing.py` doctests during tests (GH-112156) (#112231) --- Lib/test/test_typing.py | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 30056e49ae29d9..12f78b580effaa 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -8466,5 +8466,11 @@ def test_is_not_instance_of_iterable(self): self.assertNotIsInstance(type_to_test, collections.abc.Iterable) +def load_tests(loader, tests, pattern): + import doctest + tests.addTests(doctest.DocTestSuite(typing)) + return tests + + if __name__ == '__main__': main() From e19d75df7617026d4822fcd95de3d4d277d53525 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 19 Nov 2023 05:34:54 +0100 Subject: [PATCH 208/265] [3.11] gh-79871: IDLE - Fix and test debugger module (GH-11451) (#112257) Add docstrings to the debugger module. Fix two bugs: initialize Idb.botframe (should be in Bdb); In Idb.in_rpc_code, check whether prev_frame is None before trying to use it. Make other code changes. Expand test_debugger coverage from 19% to 66%. --------- (cherry picked from commit adedcfa06b553242d8033f6d9bebbcb3bc0dbb4d) Co-authored-by: Anthony Shaw Co-authored-by: Terry Jan Reedy --- Lib/idlelib/debugger.py | 179 ++++++++---- Lib/idlelib/idle_test/test_debugger.py | 276 +++++++++++++++++- Lib/idlelib/pyshell.py | 16 +- Lib/idlelib/stackviewer.py | 2 + .../2019-01-07-06-18-25.bpo-35668.JimxP5.rst | 4 + 5 files changed, 400 insertions(+), 77 deletions(-) create mode 100644 Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst diff --git a/Lib/idlelib/debugger.py b/Lib/idlelib/debugger.py index a92bb98d908d46..f487b4c4b16a60 100644 --- a/Lib/idlelib/debugger.py +++ b/Lib/idlelib/debugger.py @@ -1,3 +1,20 @@ +"""Debug user code with a GUI interface to a subclass of bdb.Bdb. + +The Idb idb and Debugger gui instances each need a reference to each +other or to an rpc proxy for each other. + +If IDLE is started with '-n', so that user code and idb both run in the +IDLE process, Debugger is called without an idb. Debugger.__init__ +calls Idb with its incomplete self. Idb.__init__ stores gui and gui +then stores idb. + +If IDLE is started normally, so that user code executes in a separate +process, debugger_r.start_remote_debugger is called, executing in the +IDLE process. It calls 'start the debugger' in the remote process, +which calls Idb with a gui proxy. Then Debugger is called in the IDLE +for more. +""" + import bdb import os @@ -10,66 +27,95 @@ class Idb(bdb.Bdb): + "Supply user_line and user_exception functions for Bdb." def __init__(self, gui): - self.gui = gui # An instance of Debugger or proxy of remote. - bdb.Bdb.__init__(self) + self.gui = gui # An instance of Debugger or proxy thereof. + super().__init__() def user_line(self, frame): - if self.in_rpc_code(frame): + """Handle a user stopping or breaking at a line. + + Convert frame to a string and send it to gui. 
+ """ + if _in_rpc_code(frame): self.set_step() return - message = self.__frame2message(frame) + message = _frame2message(frame) try: self.gui.interaction(message, frame) except TclError: # When closing debugger window with [x] in 3.x pass - def user_exception(self, frame, info): - if self.in_rpc_code(frame): + def user_exception(self, frame, exc_info): + """Handle an the occurrence of an exception.""" + if _in_rpc_code(frame): self.set_step() return - message = self.__frame2message(frame) - self.gui.interaction(message, frame, info) - - def in_rpc_code(self, frame): - if frame.f_code.co_filename.count('rpc.py'): - return True - else: - prev_frame = frame.f_back - prev_name = prev_frame.f_code.co_filename - if 'idlelib' in prev_name and 'debugger' in prev_name: - # catch both idlelib/debugger.py and idlelib/debugger_r.py - # on both Posix and Windows - return False - return self.in_rpc_code(prev_frame) - - def __frame2message(self, frame): - code = frame.f_code - filename = code.co_filename - lineno = frame.f_lineno - basename = os.path.basename(filename) - message = f"{basename}:{lineno}" - if code.co_name != "?": - message = f"{message}: {code.co_name}()" - return message + message = _frame2message(frame) + self.gui.interaction(message, frame, exc_info) + +def _in_rpc_code(frame): + "Determine if debugger is within RPC code." + if frame.f_code.co_filename.count('rpc.py'): + return True # Skip this frame. + else: + prev_frame = frame.f_back + if prev_frame is None: + return False + prev_name = prev_frame.f_code.co_filename + if 'idlelib' in prev_name and 'debugger' in prev_name: + # catch both idlelib/debugger.py and idlelib/debugger_r.py + # on both Posix and Windows + return False + return _in_rpc_code(prev_frame) + +def _frame2message(frame): + """Return a message string for frame.""" + code = frame.f_code + filename = code.co_filename + lineno = frame.f_lineno + basename = os.path.basename(filename) + message = f"{basename}:{lineno}" + if code.co_name != "?": + message = f"{message}: {code.co_name}()" + return message class Debugger: - - vstack = vsource = vlocals = vglobals = None + """The debugger interface. + + This class handles the drawing of the debugger window and + the interactions with the underlying debugger session. + """ + vstack = None + vsource = None + vlocals = None + vglobals = None + stackviewer = None + localsviewer = None + globalsviewer = None def __init__(self, pyshell, idb=None): + """Instantiate and draw a debugger window. + + :param pyshell: An instance of the PyShell Window + :type pyshell: :class:`idlelib.pyshell.PyShell` + + :param idb: An instance of the IDLE debugger (optional) + :type idb: :class:`idlelib.debugger.Idb` + """ if idb is None: idb = Idb(self) self.pyshell = pyshell self.idb = idb # If passed, a proxy of remote instance. self.frame = None self.make_gui() - self.interacting = 0 + self.interacting = False self.nesting_level = 0 def run(self, *args): + """Run the debugger.""" # Deal with the scenario where we've already got a program running # in the debugger and we want to start another. 
If that is the case, # our second 'run' was invoked from an event dispatched not from @@ -104,12 +150,13 @@ def run(self, *args): self.root.after(100, lambda: self.run(*args)) return try: - self.interacting = 1 + self.interacting = True return self.idb.run(*args) finally: - self.interacting = 0 + self.interacting = False def close(self, event=None): + """Close the debugger and window.""" try: self.quit() except Exception: @@ -127,6 +174,7 @@ def close(self, event=None): self.top.destroy() def make_gui(self): + """Draw the debugger gui on the screen.""" pyshell = self.pyshell self.flist = pyshell.flist self.root = root = pyshell.root @@ -135,11 +183,11 @@ def make_gui(self): self.top.wm_iconname("Debug") top.wm_protocol("WM_DELETE_WINDOW", self.close) self.top.bind("", self.close) - # + self.bframe = bframe = Frame(top) self.bframe.pack(anchor="w") self.buttons = bl = [] - # + self.bcont = b = Button(bframe, text="Go", command=self.cont) bl.append(b) self.bstep = b = Button(bframe, text="Step", command=self.step) @@ -150,14 +198,14 @@ def make_gui(self): bl.append(b) self.bret = b = Button(bframe, text="Quit", command=self.quit) bl.append(b) - # + for b in bl: b.configure(state="disabled") b.pack(side="left") - # + self.cframe = cframe = Frame(bframe) self.cframe.pack(side="left") - # + if not self.vstack: self.__class__.vstack = BooleanVar(top) self.vstack.set(1) @@ -180,20 +228,20 @@ def make_gui(self): self.bglobals = Checkbutton(cframe, text="Globals", command=self.show_globals, variable=self.vglobals) self.bglobals.grid(row=1, column=1) - # + self.status = Label(top, anchor="w") self.status.pack(anchor="w") self.error = Label(top, anchor="w") self.error.pack(anchor="w", fill="x") self.errorbg = self.error.cget("background") - # + self.fstack = Frame(top, height=1) self.fstack.pack(expand=1, fill="both") self.flocals = Frame(top) self.flocals.pack(expand=1, fill="both") self.fglobals = Frame(top, height=1) self.fglobals.pack(expand=1, fill="both") - # + if self.vstack.get(): self.show_stack() if self.vlocals.get(): @@ -204,7 +252,7 @@ def make_gui(self): def interaction(self, message, frame, info=None): self.frame = frame self.status.configure(text=message) - # + if info: type, value, tb = info try: @@ -223,28 +271,28 @@ def interaction(self, message, frame, info=None): tb = None bg = self.errorbg self.error.configure(text=m1, background=bg) - # + sv = self.stackviewer if sv: stack, i = self.idb.get_stack(self.frame, tb) sv.load_stack(stack, i) - # + self.show_variables(1) - # + if self.vsource.get(): self.sync_source_line() - # + for b in self.buttons: b.configure(state="normal") - # + self.top.wakeup() # Nested main loop: Tkinter's main loop is not reentrant, so use # Tcl's vwait facility, which reenters the event loop until an - # event handler sets the variable we're waiting on + # event handler sets the variable we're waiting on. 
self.nesting_level += 1 self.root.tk.call('vwait', '::idledebugwait') self.nesting_level -= 1 - # + for b in self.buttons: b.configure(state="disabled") self.status.configure(text="") @@ -288,8 +336,6 @@ def quit(self): def abort_loop(self): self.root.tk.call('set', '::idledebugwait', '1') - stackviewer = None - def show_stack(self): if not self.stackviewer and self.vstack.get(): self.stackviewer = sv = StackViewer(self.fstack, self.flist, self) @@ -311,9 +357,6 @@ def show_frame(self, stackitem): self.frame = stackitem[0] # lineno is stackitem[1] self.show_variables() - localsviewer = None - globalsviewer = None - def show_locals(self): lv = self.localsviewer if self.vlocals.get(): @@ -354,26 +397,32 @@ def show_variables(self, force=0): if gv: gv.load_dict(gdict, force, self.pyshell.interp.rpcclt) - def set_breakpoint_here(self, filename, lineno): + def set_breakpoint(self, filename, lineno): + """Set a filename-lineno breakpoint in the debugger. + + Called from self.load_breakpoints and EW.setbreakpoint + """ self.idb.set_break(filename, lineno) - def clear_breakpoint_here(self, filename, lineno): + def clear_breakpoint(self, filename, lineno): self.idb.clear_break(filename, lineno) def clear_file_breaks(self, filename): self.idb.clear_all_file_breaks(filename) def load_breakpoints(self): - "Load PyShellEditorWindow breakpoints into subprocess debugger" + """Load PyShellEditorWindow breakpoints into subprocess debugger.""" for editwin in self.pyshell.flist.inversedict: filename = editwin.io.filename try: for lineno in editwin.breakpoints: - self.set_breakpoint_here(filename, lineno) + self.set_breakpoint(filename, lineno) except AttributeError: continue + class StackViewer(ScrolledList): + "Code stack viewer for debugger GUI." def __init__(self, master, flist, gui): if macosx.isAquaTk(): @@ -414,12 +463,12 @@ def load_stack(self, stack, index=None): self.select(index) def popup_event(self, event): - "override base method" + "Override base method." if self.stack: return ScrolledList.popup_event(self, event) def fill_menu(self): - "override base method" + "Override base method." menu = self.menu menu.add_command(label="Go to source line", command=self.goto_source_line) @@ -427,12 +476,12 @@ def fill_menu(self): command=self.show_stack_frame) def on_select(self, index): - "override base method" + "Override base method." if 0 <= index < len(self.stack): self.gui.show_frame(self.stack[index]) def on_double(self, index): - "override base method" + "Override base method." self.show_source(index) def goto_source_line(self): @@ -457,6 +506,7 @@ def show_source(self, index): class NamespaceViewer: + "Global/local namespace viewer for debugger GUI." 
def __init__(self, master, title, dict=None): width = 0 @@ -544,6 +594,7 @@ def load_dict(self, dict, force=0, rpc_client=None): def close(self): self.frame.destroy() + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_debugger', verbosity=2, exit=False) diff --git a/Lib/idlelib/idle_test/test_debugger.py b/Lib/idlelib/idle_test/test_debugger.py index 35efb3411c73b5..db01a893cb1980 100644 --- a/Lib/idlelib/idle_test/test_debugger.py +++ b/Lib/idlelib/idle_test/test_debugger.py @@ -1,11 +1,279 @@ "Test debugger, coverage 19%" from idlelib import debugger -import unittest -from test.support import requires -requires('gui') +from collections import namedtuple +from textwrap import dedent from tkinter import Tk +from test.support import requires +import unittest +from unittest import mock +from unittest.mock import Mock, patch + +"""A test python script for the debug tests.""" +TEST_CODE = dedent(""" + i = 1 + i += 2 + if i == 3: + print(i) + """) + + +class MockFrame: + "Minimal mock frame." + + def __init__(self, code, lineno): + self.f_code = code + self.f_lineno = lineno + + +class IdbTest(unittest.TestCase): + + @classmethod + def setUpClass(cls): + cls.gui = Mock() + cls.idb = debugger.Idb(cls.gui) + + # Create test and code objects to simulate a debug session. + code_obj = compile(TEST_CODE, 'idlelib/file.py', mode='exec') + frame1 = MockFrame(code_obj, 1) + frame1.f_back = None + frame2 = MockFrame(code_obj, 2) + frame2.f_back = frame1 + cls.frame = frame2 + cls.msg = 'file.py:2: ()' + + def test_init(self): + # Test that Idb.__init_ calls Bdb.__init__. + idb = debugger.Idb(None) + self.assertIsNone(idb.gui) + self.assertTrue(hasattr(idb, 'breaks')) + + def test_user_line(self): + # Test that .user_line() creates a string message for a frame. + self.gui.interaction = Mock() + self.idb.user_line(self.frame) + self.gui.interaction.assert_called_once_with(self.msg, self.frame) + + def test_user_exception(self): + # Test that .user_exception() creates a string message for a frame. + exc_info = (type(ValueError), ValueError(), None) + self.gui.interaction = Mock() + self.idb.user_exception(self.frame, exc_info) + self.gui.interaction.assert_called_once_with( + self.msg, self.frame, exc_info) + + +class FunctionTest(unittest.TestCase): + # Test module functions together. + + def test_functions(self): + rpc_obj = compile(TEST_CODE,'rpc.py', mode='exec') + rpc_frame = MockFrame(rpc_obj, 2) + rpc_frame.f_back = rpc_frame + self.assertTrue(debugger._in_rpc_code(rpc_frame)) + self.assertEqual(debugger._frame2message(rpc_frame), + 'rpc.py:2: ()') + + code_obj = compile(TEST_CODE, 'idlelib/debugger.py', mode='exec') + code_frame = MockFrame(code_obj, 1) + code_frame.f_back = None + self.assertFalse(debugger._in_rpc_code(code_frame)) + self.assertEqual(debugger._frame2message(code_frame), + 'debugger.py:1: ()') + + code_frame.f_back = code_frame + self.assertFalse(debugger._in_rpc_code(code_frame)) + code_frame.f_back = rpc_frame + self.assertTrue(debugger._in_rpc_code(code_frame)) + + +class DebuggerTest(unittest.TestCase): + "Tests for Debugger that do not need a real root." 
+ + @classmethod + def setUpClass(cls): + cls.pyshell = Mock() + cls.pyshell.root = Mock() + cls.idb = Mock() + with patch.object(debugger.Debugger, 'make_gui'): + cls.debugger = debugger.Debugger(cls.pyshell, cls.idb) + cls.debugger.root = Mock() + + def test_cont(self): + self.debugger.cont() + self.idb.set_continue.assert_called_once() + + def test_step(self): + self.debugger.step() + self.idb.set_step.assert_called_once() + + def test_quit(self): + self.debugger.quit() + self.idb.set_quit.assert_called_once() + + def test_next(self): + with patch.object(self.debugger, 'frame') as frame: + self.debugger.next() + self.idb.set_next.assert_called_once_with(frame) + + def test_ret(self): + with patch.object(self.debugger, 'frame') as frame: + self.debugger.ret() + self.idb.set_return.assert_called_once_with(frame) + + def test_clear_breakpoint(self): + self.debugger.clear_breakpoint('test.py', 4) + self.idb.clear_break.assert_called_once_with('test.py', 4) + + def test_clear_file_breaks(self): + self.debugger.clear_file_breaks('test.py') + self.idb.clear_all_file_breaks.assert_called_once_with('test.py') + + def test_set_load_breakpoints(self): + # Test the .load_breakpoints() method calls idb. + FileIO = namedtuple('FileIO', 'filename') + + class MockEditWindow(object): + def __init__(self, fn, breakpoints): + self.io = FileIO(fn) + self.breakpoints = breakpoints + + self.pyshell.flist = Mock() + self.pyshell.flist.inversedict = ( + MockEditWindow('test1.py', [4, 4]), + MockEditWindow('test2.py', [13, 44, 45]), + ) + self.debugger.set_breakpoint('test0.py', 1) + self.idb.set_break.assert_called_once_with('test0.py', 1) + self.debugger.load_breakpoints() # Call set_breakpoint 5 times. + self.idb.set_break.assert_has_calls( + [mock.call('test0.py', 1), + mock.call('test1.py', 4), + mock.call('test1.py', 4), + mock.call('test2.py', 13), + mock.call('test2.py', 44), + mock.call('test2.py', 45)]) + + def test_sync_source_line(self): + # Test that .sync_source_line() will set the flist.gotofileline with fixed frame. + test_code = compile(TEST_CODE, 'test_sync.py', 'exec') + test_frame = MockFrame(test_code, 1) + self.debugger.frame = test_frame + + self.debugger.flist = Mock() + with patch('idlelib.debugger.os.path.exists', return_value=True): + self.debugger.sync_source_line() + self.debugger.flist.gotofileline.assert_called_once_with('test_sync.py', 1) + + +class DebuggerGuiTest(unittest.TestCase): + """Tests for debugger.Debugger that need tk root. + + close needs debugger.top set in make_gui. + """ + + @classmethod + def setUpClass(cls): + requires('gui') + cls.root = root = Tk() + root.withdraw() + cls.pyshell = Mock() + cls.pyshell.root = root + cls.idb = Mock() +# stack tests fail with debugger here. +## cls.debugger = debugger.Debugger(cls.pyshell, cls.idb) +## cls.debugger.root = root +## # real root needed for real make_gui +## # run, interacting, abort_loop + + @classmethod + def tearDownClass(cls): + cls.root.destroy() + del cls.root + + def setUp(self): + self.debugger = debugger.Debugger(self.pyshell, self.idb) + self.debugger.root = self.root + # real root needed for real make_gui + # run, interacting, abort_loop + + def test_run_debugger(self): + self.debugger.run(1, 'two') + self.idb.run.assert_called_once_with(1, 'two') + self.assertEqual(self.debugger.interacting, 0) + + def test_close(self): + # Test closing the window in an idle state. 
+ self.debugger.close() + self.pyshell.close_debugger.assert_called_once() + + def test_show_stack(self): + self.debugger.show_stack() + self.assertEqual(self.debugger.stackviewer.gui, self.debugger) + + def test_show_stack_with_frame(self): + test_frame = MockFrame(None, None) + self.debugger.frame = test_frame + + # Reset the stackviewer to force it to be recreated. + self.debugger.stackviewer = None + self.idb.get_stack.return_value = ([], 0) + self.debugger.show_stack() + + # Check that the newly created stackviewer has the test gui as a field. + self.assertEqual(self.debugger.stackviewer.gui, self.debugger) + self.idb.get_stack.assert_called_once_with(test_frame, None) + + +class StackViewerTest(unittest.TestCase): + + @classmethod + def setUpClass(cls): + requires('gui') + cls.root = Tk() + cls.root.withdraw() + + @classmethod + def tearDownClass(cls): + cls.root.destroy() + del cls.root + + def setUp(self): + self.code = compile(TEST_CODE, 'test_stackviewer.py', 'exec') + self.stack = [ + (MockFrame(self.code, 1), 1), + (MockFrame(self.code, 2), 2) + ] + # Create a stackviewer and load the test stack. + self.sv = debugger.StackViewer(self.root, None, None) + self.sv.load_stack(self.stack) + + def test_init(self): + # Test creation of StackViewer. + gui = None + flist = None + master_window = self.root + sv = debugger.StackViewer(master_window, flist, gui) + self.assertTrue(hasattr(sv, 'stack')) + + def test_load_stack(self): + # Test the .load_stack() method against a fixed test stack. + # Check the test stack is assigned and the list contains the repr of them. + self.assertEqual(self.sv.stack, self.stack) + self.assertTrue('?.(), line 1:' in self.sv.get(0)) + self.assertEqual(self.sv.get(1), '?.(), line 2: ') + + def test_show_source(self): + # Test the .show_source() method against a fixed test stack. + # Patch out the file list to monitor it + self.sv.flist = Mock() + # Patch out isfile to pretend file exists. + with patch('idlelib.debugger.os.path.isfile', return_value=True) as isfile: + self.sv.show_source(1) + isfile.assert_called_once_with('test_stackviewer.py') + self.sv.flist.open.assert_called_once_with('test_stackviewer.py') + class NameSpaceTest(unittest.TestCase): @@ -23,7 +291,5 @@ def test_init(self): debugger.NamespaceViewer(self.root, 'Test') -# Other classes are Idb, Debugger, and StackViewer. 
- if __name__ == '__main__': unittest.main(verbosity=2) diff --git a/Lib/idlelib/pyshell.py b/Lib/idlelib/pyshell.py index 0a7ee59f730a56..3c3d715e9894a8 100755 --- a/Lib/idlelib/pyshell.py +++ b/Lib/idlelib/pyshell.py @@ -133,8 +133,8 @@ class PyShellEditorWindow(EditorWindow): def __init__(self, *args): self.breakpoints = [] EditorWindow.__init__(self, *args) - self.text.bind("<>", self.set_breakpoint_here) - self.text.bind("<>", self.clear_breakpoint_here) + self.text.bind("<>", self.set_breakpoint_event) + self.text.bind("<>", self.clear_breakpoint_event) self.text.bind("<>", self.flist.open_shell) #TODO: don't read/write this from/to .idlerc when testing @@ -155,8 +155,8 @@ def filename_changed_hook(old_hook=self.io.filename_change_hook, ("Copy", "<>", "rmenu_check_copy"), ("Paste", "<>", "rmenu_check_paste"), (None, None, None), - ("Set Breakpoint", "<>", None), - ("Clear Breakpoint", "<>", None) + ("Set Breakpoint", "<>", None), + ("Clear Breakpoint", "<>", None) ] def color_breakpoint_text(self, color=True): @@ -181,11 +181,11 @@ def set_breakpoint(self, lineno): self.breakpoints.append(lineno) try: # update the subprocess debugger debug = self.flist.pyshell.interp.debugger - debug.set_breakpoint_here(filename, lineno) + debug.set_breakpoint(filename, lineno) except: # but debugger may not be active right now.... pass - def set_breakpoint_here(self, event=None): + def set_breakpoint_event(self, event=None): text = self.text filename = self.io.filename if not filename: @@ -194,7 +194,7 @@ def set_breakpoint_here(self, event=None): lineno = int(float(text.index("insert"))) self.set_breakpoint(lineno) - def clear_breakpoint_here(self, event=None): + def clear_breakpoint_event(self, event=None): text = self.text filename = self.io.filename if not filename: @@ -209,7 +209,7 @@ def clear_breakpoint_here(self, event=None): "insert lineend +1char") try: debug = self.flist.pyshell.interp.debugger - debug.clear_breakpoint_here(filename, lineno) + debug.clear_breakpoint(filename, lineno) except: pass diff --git a/Lib/idlelib/stackviewer.py b/Lib/idlelib/stackviewer.py index 4858cc682a4f45..f8e60fd9b6d818 100644 --- a/Lib/idlelib/stackviewer.py +++ b/Lib/idlelib/stackviewer.py @@ -1,3 +1,5 @@ +# Rename to stackbrowser or possibly consolidate with browser. + import linecache import os diff --git a/Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst b/Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst new file mode 100644 index 00000000000000..8bb5420517d55f --- /dev/null +++ b/Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst @@ -0,0 +1,4 @@ +Add docstrings to the IDLE debugger module. Fix two bugs: +initialize Idb.botframe (should be in Bdb); in Idb.in_rpc_code, +check whether prev_frame is None before trying to use it. +Greatly expand test_debugger. From d065e30b00cc4ba3db33c45563341a33588e7bb9 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 19 Nov 2023 07:59:56 +0100 Subject: [PATCH 209/265] [3.11] IDLE: Fix test_debugger bug and buildbot failures (GH-112258) (#112260) IDLE: Fix test_debugger bug and buildbot failures (GH-112258) Missing "requires('gui')" causes Tk() to fail when no gui. This caused CI Hypothesis test to fail, but I did not understand the its error message. Then buildbots failed. IdbTest failed on draft Bdb replacement because so different. Simplified version works on old and new. 
(cherry picked from commit 14fd86a59d0d91fe72641efeb14a59d99127dec3) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/idle_test/test_debugger.py | 12 +++++++----- 1 file changed, 7 insertions(+), 5 deletions(-) diff --git a/Lib/idlelib/idle_test/test_debugger.py b/Lib/idlelib/idle_test/test_debugger.py index db01a893cb1980..d1c9638dd5d711 100644 --- a/Lib/idlelib/idle_test/test_debugger.py +++ b/Lib/idlelib/idle_test/test_debugger.py @@ -1,4 +1,7 @@ -"Test debugger, coverage 19%" +"""Test debugger, coverage 66% + +Try to make tests pass with draft bdbx, which may replace bdb in 3.13+. +""" from idlelib import debugger from collections import namedtuple @@ -44,10 +47,8 @@ def setUpClass(cls): cls.msg = 'file.py:2: ()' def test_init(self): - # Test that Idb.__init_ calls Bdb.__init__. - idb = debugger.Idb(None) - self.assertIsNone(idb.gui) - self.assertTrue(hasattr(idb, 'breaks')) + self.assertIs(self.idb.gui, self.gui) + # Won't test super call since two Bdbs are very different. def test_user_line(self): # Test that .user_line() creates a string message for a frame. @@ -279,6 +280,7 @@ class NameSpaceTest(unittest.TestCase): @classmethod def setUpClass(cls): + requires('gui') cls.root = Tk() cls.root.withdraw() From e9a97c3bf9a516607d11904e55f9004710b1493b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 19 Nov 2023 11:02:49 +0100 Subject: [PATCH 210/265] [3.11] gh-110383: Fix documentation profile cumtime fix (GH-112221) (#112263) Co-authored-by: Alex Ptakhin Co-authored-by: Hugo van Kemenade --- Doc/library/profile.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/profile.rst b/Doc/library/profile.rst index 4c60a1e0d781b0..cc059b66fcb84b 100644 --- a/Doc/library/profile.rst +++ b/Doc/library/profile.rst @@ -82,8 +82,8 @@ the following:: The first line indicates that 214 calls were monitored. Of those calls, 207 were :dfn:`primitive`, meaning that the call was not induced via recursion. The -next line: ``Ordered by: cumulative time``, indicates that the text string in the -far right column was used to sort the output. The column headings include: +next line: ``Ordered by: cumulative time`` indicates the output is sorted +by the ``cumtime`` values. The column headings include: ncalls for the number of calls. From 11b91be55b2d77caf64fb97831452f23f219e433 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 19 Nov 2023 13:29:28 +0100 Subject: [PATCH 211/265] [3.11] gh-110383: Explained which error message is generated when there is an unhandled exception (GH-111574) (#112265) Co-authored-by: Unique-Usman <86585626+Unique-Usman@users.noreply.github.com> Co-authored-by: Hugo van Kemenade --- Doc/tutorial/errors.rst | 3 +-- 1 file changed, 1 insertion(+), 2 deletions(-) diff --git a/Doc/tutorial/errors.rst b/Doc/tutorial/errors.rst index 1ec59767e9ce12..4058ebe8efdb42 100644 --- a/Doc/tutorial/errors.rst +++ b/Doc/tutorial/errors.rst @@ -108,8 +108,7 @@ The :keyword:`try` statement works as follows. * If an exception occurs which does not match the exception named in the *except clause*, it is passed on to outer :keyword:`try` statements; if no handler is - found, it is an *unhandled exception* and execution stops with a message as - shown above. + found, it is an *unhandled exception* and execution stops with an error message. 
A :keyword:`try` statement may have more than one *except clause*, to specify handlers for different exceptions. At most one handler will be executed. From f6e11eab491b63ded24607a28cf53be056bdb926 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 20 Nov 2023 00:25:40 +0100 Subject: [PATCH 212/265] [3.11] gh-73561: Omit interface scope from IPv6 when used as Host header (GH-93324) (#112273) gh-73561: Omit interface scope from IPv6 when used as Host header (GH-93324) Omit the `@interface_scope` from an IPv6 address when used as Host header by `http.client`. --------- (cherry picked from commit ce1096f974d3158a92e050f9226700775b8db398) [Google LLC] Co-authored-by: Michael <35783820+mib1185@users.noreply.github.com> --- Lib/http/client.py | 12 ++++++++++-- Lib/test/test_httplib.py | 16 ++++++++++++++++ ...2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst | 1 + 3 files changed, 27 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst diff --git a/Lib/http/client.py b/Lib/http/client.py index 1c7f4639db5888..87bca4d76aba92 100644 --- a/Lib/http/client.py +++ b/Lib/http/client.py @@ -172,6 +172,13 @@ def _encode(data, name='data'): "if you want to send it encoded in UTF-8." % (name.title(), data[err.start:err.end], name)) from None +def _strip_ipv6_iface(enc_name: bytes) -> bytes: + """Remove interface scope from IPv6 address.""" + enc_name, percent, _ = enc_name.partition(b"%") + if percent: + assert enc_name.startswith(b'['), enc_name + enc_name += b']' + return enc_name class HTTPMessage(email.message.Message): # XXX The only usage of this method is in @@ -1161,7 +1168,7 @@ def putrequest(self, method, url, skip_host=False, netloc_enc = netloc.encode("ascii") except UnicodeEncodeError: netloc_enc = netloc.encode("idna") - self.putheader('Host', netloc_enc) + self.putheader('Host', _strip_ipv6_iface(netloc_enc)) else: if self._tunnel_host: host = self._tunnel_host @@ -1178,8 +1185,9 @@ def putrequest(self, method, url, skip_host=False, # As per RFC 273, IPv6 address should be wrapped with [] # when used as Host header - if host.find(':') >= 0: + if ":" in host: host_enc = b'[' + host_enc + b']' + host_enc = _strip_ipv6_iface(host_enc) if port == self.default_port: self.putheader('Host', host_enc) diff --git a/Lib/test/test_httplib.py b/Lib/test/test_httplib.py index f6a9c820b54d31..f9da4927f682e9 100644 --- a/Lib/test/test_httplib.py +++ b/Lib/test/test_httplib.py @@ -285,6 +285,22 @@ def test_ipv6host_header(self): conn.request('GET', '/foo') self.assertTrue(sock.data.startswith(expected)) + expected = b'GET /foo HTTP/1.1\r\nHost: [fe80::]\r\n' \ + b'Accept-Encoding: identity\r\n\r\n' + conn = client.HTTPConnection('[fe80::%2]') + sock = FakeSocket('') + conn.sock = sock + conn.request('GET', '/foo') + self.assertTrue(sock.data.startswith(expected)) + + expected = b'GET /foo HTTP/1.1\r\nHost: [fe80::]:81\r\n' \ + b'Accept-Encoding: identity\r\n\r\n' + conn = client.HTTPConnection('[fe80::%2]:81') + sock = FakeSocket('') + conn.sock = sock + conn.request('GET', '/foo') + self.assertTrue(sock.data.startswith(expected)) + def test_malformed_headers_coped_with(self): # Issue 19996 body = "HTTP/1.1 200 OK\r\nFirst: val\r\n: nval\r\nSecond: val\r\n\r\n" diff --git a/Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst b/Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst new file mode 100644 index 00000000000000..5e00b7d20b8ca8 --- 
/dev/null +++ b/Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst @@ -0,0 +1 @@ +Omit the interface scope from an IPv6 address when used as Host header by :mod:`http.client`. From 6c51c84b39f210668fbfacce9e49bff3e2fbe3a7 Mon Sep 17 00:00:00 2001 From: Nikita Sobolev Date: Mon, 20 Nov 2023 12:04:38 +0300 Subject: [PATCH 213/265] [3.11] gh-112266: Remove `(if defined)` part from `__dict__` and `__weakref__` docstrings (GH-112268) (#112276) --- Lib/test/test_pydoc.py | 28 +++++++++---------- ...-11-19-15-57-23.gh-issue-112266.BSJMbR.rst | 2 ++ Objects/typeobject.c | 8 +++--- 3 files changed, 20 insertions(+), 18 deletions(-) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst diff --git a/Lib/test/test_pydoc.py b/Lib/test/test_pydoc.py index be224feca872ed..cbf93a88af9234 100644 --- a/Lib/test/test_pydoc.py +++ b/Lib/test/test_pydoc.py @@ -40,8 +40,8 @@ class nonascii: if test.support.HAVE_DOCSTRINGS: expected_data_docstrings = ( - 'dictionary for instance variables (if defined)', - 'list of weak references to the object (if defined)', + 'dictionary for instance variables', + 'list of weak references to the object', ) * 2 else: expected_data_docstrings = ('', '', '', '') @@ -105,10 +105,10 @@ class C(builtins.object) | Data descriptors defined here: |\x20\x20 | __dict__ - | dictionary for instance variables (if defined) + | dictionary for instance variables |\x20\x20 | __weakref__ - | list of weak references to the object (if defined) + | list of weak references to the object FUNCTIONS doc_func() @@ -166,16 +166,16 @@ class A(builtins.object) Data descriptors defined here: __dict__ - dictionary for instance variables (if defined) + dictionary for instance variables __weakref__ - list of weak references to the object (if defined) + list of weak references to the object class B(builtins.object) Data descriptors defined here: __dict__ - dictionary for instance variables (if defined) + dictionary for instance variables __weakref__ - list of weak references to the object (if defined) + list of weak references to the object Data and other attributes defined here: NO_MEANING = 'eggs' __annotations__ = {'NO_MEANING': } @@ -192,9 +192,9 @@ class C(builtins.object) __class_getitem__(item) from builtins.type Data descriptors defined here: __dict__ - dictionary for instance variables (if defined) + dictionary for instance variables __weakref__ - list of weak references to the object (if defined) + list of weak references to the object Functions doc_func() @@ -826,10 +826,10 @@ class B(A) | Data descriptors inherited from A: |\x20\x20 | __dict__ - | dictionary for instance variables (if defined) + | dictionary for instance variables |\x20\x20 | __weakref__ - | list of weak references to the object (if defined) + | list of weak references to the object ''' % __name__) doc = pydoc.render_doc(B, renderer=pydoc.HTMLDoc()) @@ -858,9 +858,9 @@ class B(A) Data descriptors inherited from A: __dict__ - dictionary for instance variables (if defined) + dictionary for instance variables __weakref__ - list of weak references to the object (if defined) + list of weak references to the object """ as_text = html2text(doc) expected_lines = [line.strip() for line in expected_text.split("\n") if line] diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst b/Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst new file mode 100644 index 00000000000000..18433db9bb976e --- /dev/null +++ 
b/Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst @@ -0,0 +1,2 @@ +Change docstrings of :attr:`~object.__dict__` and +:attr:`~object.__weakref__`. diff --git a/Objects/typeobject.c b/Objects/typeobject.c index 4bdf1881581fc3..8c2e725cac9d2e 100644 --- a/Objects/typeobject.c +++ b/Objects/typeobject.c @@ -2399,21 +2399,21 @@ subtype_getweakref(PyObject *obj, void *context) static PyGetSetDef subtype_getsets_full[] = { {"__dict__", subtype_dict, subtype_setdict, - PyDoc_STR("dictionary for instance variables (if defined)")}, + PyDoc_STR("dictionary for instance variables")}, {"__weakref__", subtype_getweakref, NULL, - PyDoc_STR("list of weak references to the object (if defined)")}, + PyDoc_STR("list of weak references to the object")}, {0} }; static PyGetSetDef subtype_getsets_dict_only[] = { {"__dict__", subtype_dict, subtype_setdict, - PyDoc_STR("dictionary for instance variables (if defined)")}, + PyDoc_STR("dictionary for instance variables")}, {0} }; static PyGetSetDef subtype_getsets_weakref_only[] = { {"__weakref__", subtype_getweakref, NULL, - PyDoc_STR("list of weak references to the object (if defined)")}, + PyDoc_STR("list of weak references to the object")}, {0} }; From 13975051b5ce3cee6de841716165a78389efb58f Mon Sep 17 00:00:00 2001 From: DPR Date: Tue, 21 Nov 2023 07:12:17 +0800 Subject: [PATCH 214/265] [3.11] gh-109538: Catch closed loop runtime error and issue warning (GH-111983) (#112141) Issue a ResourceWarning instead. Co-authored-by: Hugo van Kemenade (cherry picked from commit e0f512797596282bff63260f8102592aad37cdf1) --- Lib/asyncio/streams.py | 7 ++- Lib/test/test_asyncio/test_streams.py | 56 +++++++++++++++++++ ...-11-11-16-42-48.gh-issue-109538.cMG5ux.rst | 1 + 3 files changed, 62 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst diff --git a/Lib/asyncio/streams.py b/Lib/asyncio/streams.py index 7c407067e05a16..23b6e4c32f7c68 100644 --- a/Lib/asyncio/streams.py +++ b/Lib/asyncio/streams.py @@ -404,8 +404,11 @@ async def start_tls(self, sslcontext, *, def __del__(self): if not self._transport.is_closing(): - self.close() - + if self._loop.is_closed(): + warnings.warn("loop is closed", ResourceWarning) + else: + self.close() + warnings.warn(f"unclosed {self!r}", ResourceWarning) class StreamReader: diff --git a/Lib/test/test_asyncio/test_streams.py b/Lib/test/test_asyncio/test_streams.py index 86354306f1fff3..f5decbe53dacb3 100644 --- a/Lib/test/test_asyncio/test_streams.py +++ b/Lib/test/test_asyncio/test_streams.py @@ -1067,6 +1067,62 @@ def test_eof_feed_when_closing_writer(self): self.assertEqual(messages, []) + def test_unclosed_resource_warnings(self): + async def inner(httpd): + rd, wr = await asyncio.open_connection(*httpd.address) + + wr.write(b'GET / HTTP/1.0\r\n\r\n') + data = await rd.readline() + self.assertEqual(data, b'HTTP/1.0 200 OK\r\n') + data = await rd.read() + self.assertTrue(data.endswith(b'\r\n\r\nTest message')) + with self.assertWarns(ResourceWarning) as cm: + del wr + gc.collect() + self.assertEqual(len(cm.warnings), 1) + self.assertTrue(str(cm.warnings[0].message).startswith("unclosed None: port = socket_helper.find_unused_port() diff --git a/Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst b/Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst new file mode 100644 index 00000000000000..d1ee4c054a3f19 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst @@ 
-0,0 +1 @@ +Issue warning message instead of having :class:`RuntimeError` be displayed when event loop has already been closed at :meth:`StreamWriter.__del__`. From 253ad788c65f33b9bf91aaa956801335e725ae84 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 21 Nov 2023 09:29:34 +0100 Subject: [PATCH 215/265] [3.11] gh-110950: add upstream Tk fixes to macOS installer. (GH-111041) (#112293) Add upstream Tk patches for three problems affecting tkinter users: - Update macOS installer to include a fix accepted by upstream Tcl/Tk for a crash encountered after the first :meth:`tkinter.Tk` instance is destroyed. (gh-92603) - Update macOS installer to include an upstream Tcl/Tk fix for the ``ttk::ThemeChanged`` error encountered in Tkinter. (gh-71383) - Update macOS installer to include an upstream Tcl/Tk fix for the ``Secure coding is not enabled for restorable state!`` warning encountered in Tkinter on macOS 14 Sonoma. (gh-110950) (cherry picked from commit d67f947c72af8a215db2fd285e5de9b1e671fde1) Co-authored-by: Christopher Chavez Co-authored-by: Ned Deily --- Mac/BuildScript/backport_gh110950_fix.patch | 25 ++++++ Mac/BuildScript/backport_gh71383_fix.patch | 89 +++++++++++++++++++ Mac/BuildScript/backport_gh92603_fix.patch | 82 +++++++++++++++++ Mac/BuildScript/build-installer.py | 4 +- ...3-08-30-16-33-57.gh-issue-92603.ATkKVO.rst | 3 + ...3-09-02-08-49-57.gh-issue-71383.Ttkchg.rst | 2 + ...-10-18-17-26-36.gh-issue-110950.sonoma.rst | 3 + 7 files changed, 206 insertions(+), 2 deletions(-) create mode 100644 Mac/BuildScript/backport_gh110950_fix.patch create mode 100644 Mac/BuildScript/backport_gh71383_fix.patch create mode 100644 Mac/BuildScript/backport_gh92603_fix.patch create mode 100644 Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst create mode 100644 Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst create mode 100644 Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst diff --git a/Mac/BuildScript/backport_gh110950_fix.patch b/Mac/BuildScript/backport_gh110950_fix.patch new file mode 100644 index 00000000000000..793f5a78d8c6ab --- /dev/null +++ b/Mac/BuildScript/backport_gh110950_fix.patch @@ -0,0 +1,25 @@ +From https://core.tcl-lang.org/tk/info/ed7cfbac8db11aa0 + +Note: the diff here is hand-tweaked so that it applies cleanly to both Tk 8.6.8 and 8.6.13. + +diff --git a/macosx/tkMacOSXInit.c b/macosx/tkMacOSXInit.c +index 71d7c3385..e6a68356c 100644 +--- a/macosx/tkMacOSXInit.c ++++ b/macosx/tkMacOSXInit.c +@@ -128,6 +128,16 @@ static int TkMacOSXGetAppPathCmd(ClientData cd, Tcl_Interp *ip, + observe(NSApplicationDidChangeScreenParametersNotification, displayChanged:); + observe(NSTextInputContextKeyboardSelectionDidChangeNotification, keyboardChanged:); + #undef observe ++} ++ ++ ++/* ++ * Fix for 10b38a7a7c. 
++ */ ++ ++- (BOOL)applicationSupportsSecureRestorableState:(NSApplication *)app ++{ ++ return YES; + } + + -(void)applicationWillFinishLaunching:(NSNotification *)aNotification diff --git a/Mac/BuildScript/backport_gh71383_fix.patch b/Mac/BuildScript/backport_gh71383_fix.patch new file mode 100644 index 00000000000000..d77b1045e8c165 --- /dev/null +++ b/Mac/BuildScript/backport_gh71383_fix.patch @@ -0,0 +1,89 @@ +Adapted from https://core.tcl-lang.org/tk/info/b1876b9ebc4b + +Index: generic/tkInt.h +================================================================== +--- a/generic/tkInt.h.orig ++++ b/generic/tkInt.h +@@ -1094,10 +1094,11 @@ + /* + * Themed widget set init function: + */ + + MODULE_SCOPE int Ttk_Init(Tcl_Interp *interp); ++MODULE_SCOPE void Ttk_TkDestroyedHandler(Tcl_Interp *interp); + + /* + * Internal functions shared among Tk modules but not exported to the outside + * world: + */ + +Index: generic/tkWindow.c +================================================================== +--- a/generic/tkWindow.c.orig ++++ b/generic/tkWindow.c +@@ -1619,10 +1619,11 @@ + TkBindFree(winPtr->mainPtr); + TkDeleteAllImages(winPtr->mainPtr); + TkFontPkgFree(winPtr->mainPtr); + TkFocusFree(winPtr->mainPtr); + TkStylePkgFree(winPtr->mainPtr); ++ Ttk_TkDestroyedHandler(winPtr->mainPtr->interp); + + /* + * When embedding Tk into other applications, make sure that all + * destroy events reach the server. Otherwise the embedding + * application may also attempt to destroy the windows, resulting + +Index: generic/ttk/ttkTheme.c +================================================================== +--- a/generic/ttk/ttkTheme.c.orig ++++ b/generic/ttk/ttkTheme.c +@@ -415,17 +415,10 @@ + StylePackageData *pkgPtr = (StylePackageData *)clientData; + Tcl_HashSearch search; + Tcl_HashEntry *entryPtr; + Cleanup *cleanup; + +- /* +- * Cancel any pending ThemeChanged calls: +- */ +- if (pkgPtr->themeChangePending) { +- Tcl_CancelIdleCall(ThemeChangedProc, pkgPtr); +- } +- + /* + * Free themes. + */ + entryPtr = Tcl_FirstHashEntry(&pkgPtr->themeTable, &search); + while (entryPtr != NULL) { +@@ -528,10 +521,29 @@ + if (!pkgPtr->themeChangePending) { + Tcl_DoWhenIdle(ThemeChangedProc, pkgPtr); + pkgPtr->themeChangePending = 1; + } + } ++ ++/* Ttk_TkDestroyedHandler -- ++ * See bug [310c74ecf440]: idle calls to ThemeChangedProc() ++ * need to be canceled when Tk is destroyed, since the interp ++ * may still be active afterward; canceling them from ++ * Ttk_StylePkgFree() would be too late. ++ */ ++void Ttk_TkDestroyedHandler( ++ Tcl_Interp* interp) ++{ ++ StylePackageData* pkgPtr = GetStylePackageData(interp); ++ ++ /* ++ * Cancel any pending ThemeChanged calls: ++ */ ++ if (pkgPtr->themeChangePending) { ++ Tcl_CancelIdleCall(ThemeChangedProc, pkgPtr); ++ } ++} + + /* + * Ttk_CreateTheme -- + * Create a new theme and register it in the global theme table. 
+ * + diff --git a/Mac/BuildScript/backport_gh92603_fix.patch b/Mac/BuildScript/backport_gh92603_fix.patch new file mode 100644 index 00000000000000..9a37b029650340 --- /dev/null +++ b/Mac/BuildScript/backport_gh92603_fix.patch @@ -0,0 +1,82 @@ +Accepted upstream for release in Tk 8.6.14: +https://core.tcl-lang.org/tk/info/cf3830280b + +--- tk8.6.13/macosx/tkMacOSXWindowEvent.c.orig ++++ tk8.6.13-patched/macosx/tkMacOSXWindowEvent.c +@@ -239,8 +239,8 @@ extern NSString *NSWindowDidOrderOffScreenNotification; + if (winPtr) { + TKContentView *view = [window contentView]; + +-#if MAC_OS_X_VERSION_MAX_ALLOWED >= 101500 +- if (@available(macOS 10.15, *)) { ++#if MAC_OS_X_VERSION_MAX_ALLOWED >= 101400 ++ if (@available(macOS 10.14, *)) { + [view viewDidChangeEffectiveAppearance]; + } + #endif +@@ -1237,29 +1237,8 @@ static const char *const accentNames[] = { + } else if (effectiveAppearanceName == NSAppearanceNameDarkAqua) { + TkSendVirtualEvent(tkwin, "DarkAqua", NULL); + } +- if ([NSApp macOSVersion] < 101500) { +- +- /* +- * Mojave cannot handle the KVO shenanigans that we need for the +- * highlight and accent color notifications. +- */ +- +- return; +- } + if (!defaultColor) { + defaultColor = [NSApp macOSVersion] < 110000 ? "Blue" : "Multicolor"; +- preferences = [[NSUserDefaults standardUserDefaults] retain]; +- +- /* +- * AppKit calls this method when the user changes the Accent Color +- * but not when the user changes the Highlight Color. So we register +- * to receive KVO notifications for Highlight Color as well. +- */ +- +- [preferences addObserver:self +- forKeyPath:@"AppleHighlightColor" +- options:NSKeyValueObservingOptionNew +- context:NULL]; + } + NSString *accent = [preferences stringForKey:@"AppleAccentColor"]; + NSArray *words = [[preferences stringForKey:@"AppleHighlightColor"] +--- tk8.6.13/macosx/tkMacOSXWm.c.orig ++++ tk8.6.13-patched/macosx/tkMacOSXWm.c +@@ -1289,6 +1289,11 @@ TkWmDeadWindow( + [NSApp _setMainWindow:nil]; + } + [deadNSWindow close]; ++#if MAC_OS_X_VERSION_MAX_ALLOWED >= 101400 ++ NSUserDefaults *preferences = [NSUserDefaults standardUserDefaults]; ++ [preferences removeObserver:deadNSWindow.contentView ++ forKeyPath:@"AppleHighlightColor"]; ++#endif + [deadNSWindow release]; + + #if DEBUG_ZOMBIES > 1 +@@ -6763,6 +6768,21 @@ TkMacOSXMakeRealWindowExist( + } + TKContentView *contentView = [[TKContentView alloc] + initWithFrame:NSZeroRect]; ++#if MAC_OS_X_VERSION_MAX_ALLOWED >= 101400 ++ NSUserDefaults *preferences = [NSUserDefaults standardUserDefaults]; ++ ++ /* ++ * AppKit calls the viewDidChangeEffectiveAppearance method when the ++ * user changes the Accent Color but not when the user changes the ++ * Highlight Color. So we register to receive KVO notifications for ++ * Highlight Color as well. 
++ */ ++ ++ [preferences addObserver:contentView ++ forKeyPath:@"AppleHighlightColor" ++ options:NSKeyValueObservingOptionNew ++ context:NULL]; ++#endif + [window setContentView:contentView]; + [contentView release]; + [window setDelegate:NSApp]; diff --git a/Mac/BuildScript/build-installer.py b/Mac/BuildScript/build-installer.py index f3728e3259e514..69bbde2cf1336b 100755 --- a/Mac/BuildScript/build-installer.py +++ b/Mac/BuildScript/build-installer.py @@ -261,14 +261,14 @@ def library_recipes(): tcl_checksum='81656d3367af032e0ae6157eff134f89' tk_checksum='5e0faecba458ee1386078fb228d008ba' - tk_patches = ['tk868_on_10_8_10_9.patch'] + tk_patches = ['backport_gh71383_fix.patch', 'tk868_on_10_8_10_9.patch', 'backport_gh110950_fix.patch'] else: tcl_tk_ver='8.6.13' tcl_checksum='43a1fae7412f61ff11de2cfd05d28cfc3a73762f354a417c62370a54e2caf066' tk_checksum='2e65fa069a23365440a3c56c556b8673b5e32a283800d8d9b257e3f584ce0675' - tk_patches = [ ] + tk_patches = ['backport_gh92603_fix.patch', 'backport_gh71383_fix.patch', 'backport_gh110950_fix.patch'] result.extend([ diff --git a/Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst b/Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst new file mode 100644 index 00000000000000..477463c0c21264 --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst @@ -0,0 +1,3 @@ +Update macOS installer to include a fix accepted by upstream Tcl/Tk +for a crash encountered after the first :meth:`tkinter.Tk` instance +is destroyed. diff --git a/Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst b/Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst new file mode 100644 index 00000000000000..d8f3e429aab815 --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst @@ -0,0 +1,2 @@ +Update macOS installer to include an upstream Tcl/Tk fix +for the ``ttk::ThemeChanged`` error encountered in Tkinter. diff --git a/Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst b/Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst new file mode 100644 index 00000000000000..c678c09f6aa55b --- /dev/null +++ b/Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst @@ -0,0 +1,3 @@ +Update macOS installer to include an upstream Tcl/Tk fix for the +``Secure coding is not enabled for restorable state!`` warning +encountered in Tkinter on macOS 14 Sonoma. From 640454fd32c314fd92239c571ae620dabc5aacef Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 22 Nov 2023 06:41:51 +0100 Subject: [PATCH 216/265] [3.11] Fix docstring and var name of itertools recipe (GH-112113) (#112311) Fix docstring and var name of itertools recipe (GH-112113) `prepend()` works with arbitrary iterables, not only iterators. In fact, the example given uses a `list`, which is iterable, but not an iterator. (cherry picked from commit 6c47eaccfa2550c140a24bc6e520d968731d9689) Co-authored-by: Sebastian Rittau --- Doc/library/itertools.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/library/itertools.rst b/Doc/library/itertools.rst index 2196e53502f5a4..24e2684496c075 100644 --- a/Doc/library/itertools.rst +++ b/Doc/library/itertools.rst @@ -758,10 +758,10 @@ which incur interpreter overhead. 
"Return first n items of the iterable as a list" return list(islice(iterable, n)) - def prepend(value, iterator): - "Prepend a single value in front of an iterator" + def prepend(value, iterable): + "Prepend a single value in front of an iterable" # prepend(1, [2, 3, 4]) --> 1 2 3 4 - return chain([value], iterator) + return chain([value], iterable) def tabulate(function, start=0): "Return function(0), function(1), ..." From d3c12144d34ee2b477708071321afa94a8b40a6f Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 23 Nov 2023 22:21:49 +0100 Subject: [PATCH 217/265] [3.11] Remove bogus annotations from the descriptor howto guide (gh-112349) (gh-112350) --- Doc/howto/descriptor.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/Doc/howto/descriptor.rst b/Doc/howto/descriptor.rst index 0a7263614670a1..ec7d99b440ee44 100644 --- a/Doc/howto/descriptor.rst +++ b/Doc/howto/descriptor.rst @@ -521,11 +521,11 @@ everyday Python programs. Descriptor protocol ------------------- -``descr.__get__(self, obj, type=None) -> value`` +``descr.__get__(self, obj, type=None)`` -``descr.__set__(self, obj, value) -> None`` +``descr.__set__(self, obj, value)`` -``descr.__delete__(self, obj) -> None`` +``descr.__delete__(self, obj)`` That is all there is to it. Define any of these methods and an object is considered a descriptor and can override default behavior upon being looked up From 669b8fab31e0e8edf1f19464a7cfbc187e9dd664 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 24 Nov 2023 19:15:42 +0100 Subject: [PATCH 218/265] [3.11] gh-59254: mention in open() doc that line buffering is for writing (GH-112318) (#112379) gh-59254: mention in open() doc that line buffering is for writing (GH-112318) (cherry picked from commit fafae08cc7caa25f2bd6b29106b50ef76c3e296f) Co-authored-by: Irit Katriel <1055913+iritkatriel@users.noreply.github.com> --- Doc/library/functions.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index c287f52156ba63..0e5f36f57daf17 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -1220,7 +1220,7 @@ are always available. They are listed here in alphabetical order. *buffering* is an optional integer used to set the buffering policy. Pass 0 to switch buffering off (only allowed in binary mode), 1 to select line - buffering (only usable in text mode), and an integer > 1 to indicate the size + buffering (only usable when writing in text mode), and an integer > 1 to indicate the size in bytes of a fixed-size chunk buffer. Note that specifying a buffer size this way applies for binary buffered I/O, but ``TextIOWrapper`` (i.e., files opened with ``mode='r+'``) would have another buffering. 
To disable buffering in From 8b6068e5553f81ad5d1b0be2fe35e1eb53c763ef Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 25 Nov 2023 08:59:28 +0100 Subject: [PATCH 219/265] [3.11] gh-101100: Define `test.regrtest` module to fix references (GH-112381) (#112391) gh-101100: Define `test.regrtest` module to fix references (GH-112381) Define test.regrtest module to fix references (cherry picked from commit d525d01e2794e7e736527eaa7ee309ca1252f5bd) Co-authored-by: Hugo van Kemenade --- Doc/library/test.rst | 3 +++ 1 file changed, 3 insertions(+) diff --git a/Doc/library/test.rst b/Doc/library/test.rst index 7fc2a7c8466f66..bf1a2f361a2888 100644 --- a/Doc/library/test.rst +++ b/Doc/library/test.rst @@ -159,6 +159,9 @@ guidelines to be followed: Running tests using the command-line interface ---------------------------------------------- +.. module:: test.regrtest + :synopsis: Drives the regression test suite. + The :mod:`test` package can be run as a script to drive Python's regression test suite, thanks to the :option:`-m` option: :program:`python -m test`. Under the hood, it uses :mod:`test.regrtest`; the call :program:`python -m From 68db0ed903ee677e0f2684a5598ef2006680cf27 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 25 Nov 2023 09:00:44 +0100 Subject: [PATCH 220/265] [3.11] gh-101100: Define `_tkinter` module to fix references (GH-112382) (#112393) gh-101100: Define `_tkinter` module to fix references (GH-112382) Define _tkinter module to fix references (cherry picked from commit 6b961b8ceaba372b78d03feaceb4837bf7236694) Co-authored-by: Hugo van Kemenade --- Doc/library/tkinter.rst | 3 +++ 1 file changed, 3 insertions(+) diff --git a/Doc/library/tkinter.rst b/Doc/library/tkinter.rst index ee34f2659cf3d8..aa78a0fe93d53d 100644 --- a/Doc/library/tkinter.rst +++ b/Doc/library/tkinter.rst @@ -232,6 +232,9 @@ The modules that provide Tk support include: Additional modules: +.. module:: _tkinter + :synopsis: A binary module that contains the low-level interface to Tcl/Tk. + :mod:`_tkinter` A binary module that contains the low-level interface to Tcl/Tk. 
It is automatically imported by the main :mod:`tkinter` module, From 8c9f273760c60d1c59696b3e16ead4cbd03392f2 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 25 Nov 2023 18:39:16 +0100 Subject: [PATCH 221/265] [3.11] gh-94722: fix DocTest.__eq__ for case of no line number on one side (GH-112385) (#112401) gh-94722: fix DocTest.__eq__ for case of no line number on one side (GH-112385) (cherry picked from commit fbb9027a037ff1bfaf3f596df033ca45743ee980) Co-authored-by: Irit Katriel <1055913+iritkatriel@users.noreply.github.com> --- Lib/doctest.py | 6 ++++-- Lib/test/test_doctest.py | 17 +++++++++++++++++ ...023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst | 2 ++ 3 files changed, 23 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst diff --git a/Lib/doctest.py b/Lib/doctest.py index 21abe475ef85f5..5012b24b98a339 100644 --- a/Lib/doctest.py +++ b/Lib/doctest.py @@ -569,9 +569,11 @@ def __hash__(self): def __lt__(self, other): if not isinstance(other, DocTest): return NotImplemented - return ((self.name, self.filename, self.lineno, id(self)) + self_lno = self.lineno if self.lineno is not None else -1 + other_lno = other.lineno if other.lineno is not None else -1 + return ((self.name, self.filename, self_lno, id(self)) < - (other.name, other.filename, other.lineno, id(other))) + (other.name, other.filename, other_lno, id(other))) ###################################################################### ## 3. DocTestParser diff --git a/Lib/test/test_doctest.py b/Lib/test/test_doctest.py index 617529dded3b73..6c0da0b156a5d2 100644 --- a/Lib/test/test_doctest.py +++ b/Lib/test/test_doctest.py @@ -416,6 +416,23 @@ def test_DocTest(): r""" False >>> test != other_test True + >>> test < other_test + False + >>> other_test < test + True + +Test comparison with lineno None on one side + + >>> no_lineno = parser.get_doctest(docstring, globs, 'some_test', + ... 'some_test', None) + >>> test.lineno is None + False + >>> no_lineno.lineno is None + True + >>> test < no_lineno + False + >>> no_lineno < test + True Compare `DocTestCase`: diff --git a/Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst b/Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst new file mode 100644 index 00000000000000..41bd57f46ed82a --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst @@ -0,0 +1,2 @@ +Fix bug where comparison between instances of :class:`~doctest.DocTest` fails if +one of them has ``None`` as its lineno. 
From d9254f9f04a12b1543a91f4fae2a3da17ccfa05b Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Sat, 25 Nov 2023 21:26:05 +0200 Subject: [PATCH 222/265] [3.11] gh-101100 : Fix Sphinx warnings in `library/doctest.rst` (GH-112399) (#112404) Co-authored-by: Alex Waygood --- Doc/library/doctest.rst | 47 +++++++++++++++++++++++----------------- Doc/library/unittest.rst | 2 ++ Doc/tools/.nitignore | 1 - 3 files changed, 29 insertions(+), 21 deletions(-) diff --git a/Doc/library/doctest.rst b/Doc/library/doctest.rst index d00c4c88d57176..ef7e2b9a47a08c 100644 --- a/Doc/library/doctest.rst +++ b/Doc/library/doctest.rst @@ -143,13 +143,13 @@ Simple Usage: Checking Examples in Docstrings --------------------------------------------- The simplest way to start using doctest (but not necessarily the way you'll -continue to do it) is to end each module :mod:`M` with:: +continue to do it) is to end each module :mod:`!M` with:: if __name__ == "__main__": import doctest doctest.testmod() -:mod:`doctest` then examines docstrings in module :mod:`M`. +:mod:`!doctest` then examines docstrings in module :mod:`!M`. Running the module as a script causes the examples in the docstrings to get executed and verified:: @@ -401,10 +401,10 @@ What's the Execution Context? ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ By default, each time :mod:`doctest` finds a docstring to test, it uses a -*shallow copy* of :mod:`M`'s globals, so that running tests doesn't change the -module's real globals, and so that one test in :mod:`M` can't leave behind +*shallow copy* of :mod:`!M`'s globals, so that running tests doesn't change the +module's real globals, and so that one test in :mod:`!M` can't leave behind crumbs that accidentally allow another test to work. This means examples can -freely use any names defined at top-level in :mod:`M`, and names defined earlier +freely use any names defined at top-level in :mod:`!M`, and names defined earlier in the docstring being run. Examples cannot see names defined in other docstrings. @@ -956,7 +956,8 @@ and :ref:`doctest-simple-testfile`. Optional argument *exclude_empty* defaults to false. If true, objects for which no doctests are found are excluded from consideration. The default is a backward - compatibility hack, so that code still using :meth:`doctest.master.summarize` in + compatibility hack, so that code still using + :meth:`doctest.master.summarize ` in conjunction with :func:`testmod` continues to get output for objects with no tests. The *exclude_empty* argument to the newer :class:`DocTestFinder` constructor defaults to true. @@ -995,7 +996,7 @@ As your collection of doctest'ed modules grows, you'll want a way to run all their doctests systematically. :mod:`doctest` provides two functions that can be used to create :mod:`unittest` test suites from modules and text files containing doctests. To integrate with :mod:`unittest` test discovery, include -a :func:`load_tests` function in your test module:: +a :ref:`load_tests ` function in your test module:: import unittest import doctest @@ -1109,19 +1110,24 @@ from text files and modules with doctests: :func:`DocTestSuite` returns an empty :class:`unittest.TestSuite` if *module* contains no docstrings instead of raising :exc:`ValueError`. +.. exception:: failureException + + When doctests which have been converted to unit tests by :func:`DocFileSuite` + or :func:`DocTestSuite` fail, this exception is raised showing the name of + the file containing the test and a (sometimes approximate) line number. 
Under the covers, :func:`DocTestSuite` creates a :class:`unittest.TestSuite` out -of :class:`doctest.DocTestCase` instances, and :class:`DocTestCase` is a -subclass of :class:`unittest.TestCase`. :class:`DocTestCase` isn't documented +of :class:`!doctest.DocTestCase` instances, and :class:`!DocTestCase` is a +subclass of :class:`unittest.TestCase`. :class:`!DocTestCase` isn't documented here (it's an internal detail), but studying its code can answer questions about the exact details of :mod:`unittest` integration. Similarly, :func:`DocFileSuite` creates a :class:`unittest.TestSuite` out of -:class:`doctest.DocFileCase` instances, and :class:`DocFileCase` is a subclass -of :class:`DocTestCase`. +:class:`!doctest.DocFileCase` instances, and :class:`!DocFileCase` is a subclass +of :class:`!DocTestCase`. So both ways of creating a :class:`unittest.TestSuite` run instances of -:class:`DocTestCase`. This is important for a subtle reason: when you run +:class:`!DocTestCase`. This is important for a subtle reason: when you run :mod:`doctest` functions yourself, you can control the :mod:`doctest` options in use directly, by passing option flags to :mod:`doctest` functions. However, if you're writing a :mod:`unittest` framework, :mod:`unittest` ultimately controls @@ -1142,14 +1148,14 @@ reporting flags specific to :mod:`unittest` support, via this function: section :ref:`doctest-options`. Only "reporting flags" can be used. This is a module-global setting, and affects all future doctests run by module - :mod:`unittest`: the :meth:`runTest` method of :class:`DocTestCase` looks at - the option flags specified for the test case when the :class:`DocTestCase` + :mod:`unittest`: the :meth:`!runTest` method of :class:`!DocTestCase` looks at + the option flags specified for the test case when the :class:`!DocTestCase` instance was constructed. If no reporting flags were specified (which is the - typical and expected case), :mod:`doctest`'s :mod:`unittest` reporting flags are + typical and expected case), :mod:`!doctest`'s :mod:`unittest` reporting flags are :ref:`bitwise ORed ` into the option flags, and the option flags so augmented are passed to the :class:`DocTestRunner` instance created to run the doctest. If any reporting flags were specified when the - :class:`DocTestCase` instance was constructed, :mod:`doctest`'s + :class:`!DocTestCase` instance was constructed, :mod:`!doctest`'s :mod:`unittest` reporting flags are ignored. The value of the :mod:`unittest` reporting flags in effect before the function @@ -1319,7 +1325,8 @@ Example Objects A dictionary mapping from option flags to ``True`` or ``False``, which is used to override default options for this example. Any option flags not contained in this dictionary are left at their default value (as specified by the - :class:`DocTestRunner`'s :attr:`optionflags`). By default, no options are set. + :class:`DocTestRunner`'s :ref:`optionflags `). + By default, no options are set. .. _doctest-doctestfinder: @@ -1532,7 +1539,7 @@ DocTestRunner objects The output of each example is checked using the :class:`DocTestRunner`'s output checker, and the results are formatted by the - :meth:`DocTestRunner.report_\*` methods. + :meth:`!DocTestRunner.report_\*` methods. .. method:: summarize(verbose=None) @@ -1690,12 +1697,12 @@ code under the debugger: module) of the object with the doctests of interest. The result is a string, containing the object's docstring converted to a Python script, as described for :func:`script_from_examples` above. 
For example, if module :file:`a.py` - contains a top-level function :func:`f`, then :: + contains a top-level function :func:`!f`, then :: import a, doctest print(doctest.testsource(a, "a.f")) - prints a script version of function :func:`f`'s docstring, with doctests + prints a script version of function :func:`!f`'s docstring, with doctests converted to code, and the rest placed in comments. diff --git a/Doc/library/unittest.rst b/Doc/library/unittest.rst index c5826698bd35f5..48d37c0539a50c 100644 --- a/Doc/library/unittest.rst +++ b/Doc/library/unittest.rst @@ -2332,6 +2332,8 @@ Loading and running tests test names. +.. _load_tests-protocol: + load_tests Protocol ################### diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index e59d6198032314..cb7558ce08e53c 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -52,7 +52,6 @@ Doc/library/datetime.rst Doc/library/dbm.rst Doc/library/decimal.rst Doc/library/dis.rst -Doc/library/doctest.rst Doc/library/email.charset.rst Doc/library/email.compat32-message.rst Doc/library/email.errors.rst From 49005e4d02f49a1512577cb6c3eba2b68dbdd467 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sat, 25 Nov 2023 23:26:22 +0100 Subject: [PATCH 223/265] [3.11] gh-112331: Fix reference manual description of attribute lookup mechanics (gh-112375) (gh-112413) --- Doc/reference/expressions.rst | 18 ++++++++++++------ 1 file changed, 12 insertions(+), 6 deletions(-) diff --git a/Doc/reference/expressions.rst b/Doc/reference/expressions.rst index 9207fd3f3c02ee..7130aa2f6b7b7b 100644 --- a/Doc/reference/expressions.rst +++ b/Doc/reference/expressions.rst @@ -806,12 +806,18 @@ An attribute reference is a primary followed by a period and a name: The primary must evaluate to an object of a type that supports attribute references, which most objects do. This object is then asked to produce the -attribute whose name is the identifier. This production can be customized by -overriding the :meth:`__getattr__` method. If this attribute is not available, -the exception :exc:`AttributeError` is raised. Otherwise, the type and value of -the object produced is determined by the object. Multiple evaluations of the -same attribute reference may yield different objects. - +attribute whose name is the identifier. The type and value produced is +determined by the object. Multiple evaluations of the same attribute +reference may yield different objects. + +This production can be customized by overriding the +:meth:`~object.__getattribute__` method or the :meth:`~object.__getattr__` +method. The :meth:`!__getattribute__` method is called first and either +returns a value or raises :exc:`AttributeError` if the attribute is not +available. + +If an :exc:`AttributeError` is raised and the object has a :meth:`!__getattr__` +method, that method is called as a fallback. .. 
_subscriptions: From fc657d0c609eff0ca8ff7de22741be6ce11a9bd2 Mon Sep 17 00:00:00 2001 From: Hugo van Kemenade Date: Sun, 26 Nov 2023 14:22:17 +0200 Subject: [PATCH 224/265] [3.11] gh-101100: Fix Sphinx reference warnings (GH-112416) (#112422) --- Doc/c-api/set.rst | 2 +- Doc/extending/newtypes.rst | 4 ++-- Doc/library/asyncio-stream.rst | 4 ++++ Doc/library/email.errors.rst | 9 +++++++++ Doc/library/gzip.rst | 2 +- Doc/library/importlib.resources.rst | 4 ++-- Doc/library/xml.rst | 2 +- Doc/tools/.nitignore | 7 ------- 8 files changed, 20 insertions(+), 14 deletions(-) diff --git a/Doc/c-api/set.rst b/Doc/c-api/set.rst index 09c0fb6b9c5f23..cba823aa027bd6 100644 --- a/Doc/c-api/set.rst +++ b/Doc/c-api/set.rst @@ -147,7 +147,7 @@ subtypes but not for instances of :class:`frozenset` or its subtypes. Return ``1`` if found and removed, ``0`` if not found (no action taken), and ``-1`` if an error is encountered. Does not raise :exc:`KeyError` for missing keys. Raise a - :exc:`TypeError` if the *key* is unhashable. Unlike the Python :meth:`~set.discard` + :exc:`TypeError` if the *key* is unhashable. Unlike the Python :meth:`~frozenset.discard` method, this function does not automatically convert unhashable sets into temporary frozensets. Raise :exc:`SystemError` if *set* is not an instance of :class:`set` or its subtype. diff --git a/Doc/extending/newtypes.rst b/Doc/extending/newtypes.rst index 8ea76aec8290db..dd561e611d18f9 100644 --- a/Doc/extending/newtypes.rst +++ b/Doc/extending/newtypes.rst @@ -321,7 +321,7 @@ An interesting advantage of using the :c:member:`~PyTypeObject.tp_members` table descriptors that are used at runtime is that any attribute defined this way can have an associated doc string simply by providing the text in the table. An application can use the introspection API to retrieve the descriptor from the -class object, and get the doc string using its :attr:`__doc__` attribute. +class object, and get the doc string using its :attr:`!__doc__` attribute. As with the :c:member:`~PyTypeObject.tp_methods` table, a sentinel entry with a :c:member:`~PyMethodDef.ml_name` value of ``NULL`` is required. @@ -473,7 +473,7 @@ instance of your data type. Here is a simple example:: return result; } -:c:type:`Py_hash_t` is a signed integer type with a platform-varying width. +:c:type:`!Py_hash_t` is a signed integer type with a platform-varying width. Returning ``-1`` from :c:member:`~PyTypeObject.tp_hash` indicates an error, which is why you should be careful to avoid returning it when hash computation is successful, as seen above. diff --git a/Doc/library/asyncio-stream.rst b/Doc/library/asyncio-stream.rst index e9d466d95e547e..4f7da27fbd350e 100644 --- a/Doc/library/asyncio-stream.rst +++ b/Doc/library/asyncio-stream.rst @@ -204,6 +204,10 @@ StreamReader directly; use :func:`open_connection` and :func:`start_server` instead. + .. method:: feed_eof() + + Acknowledge the EOF. + .. coroutinemethod:: read(n=-1) Read up to *n* bytes from the stream. diff --git a/Doc/library/email.errors.rst b/Doc/library/email.errors.rst index 194a98696f437d..56aea6598b8615 100644 --- a/Doc/library/email.errors.rst +++ b/Doc/library/email.errors.rst @@ -58,6 +58,15 @@ The following exception classes are defined in the :mod:`email.errors` module: :class:`~email.mime.nonmultipart.MIMENonMultipart` (e.g. :class:`~email.mime.image.MIMEImage`). +.. exception:: MessageDefect() + + This is the base class for all defects found when parsing email messages. + It is derived from :exc:`ValueError`. + +.. 
exception:: HeaderDefect() + + This is the base class for all defects found when parsing email headers. + It is derived from :exc:`MessageDefect`. Here is the list of the defects that the :class:`~email.parser.FeedParser` can find while parsing messages. Note that the defects are added to the message diff --git a/Doc/library/gzip.rst b/Doc/library/gzip.rst index fdb546236aeadc..f95df41d9f344c 100644 --- a/Doc/library/gzip.rst +++ b/Doc/library/gzip.rst @@ -105,7 +105,7 @@ The module defines the following items: should only be provided in compression mode. If omitted or ``None``, the current time is used. See the :attr:`mtime` attribute for more details. - Calling a :class:`GzipFile` object's :meth:`close` method does not close + Calling a :class:`GzipFile` object's :meth:`!close` method does not close *fileobj*, since you might wish to append more material after the compressed data. This also allows you to pass an :class:`io.BytesIO` object opened for writing as *fileobj*, and retrieve the resulting memory buffer using the diff --git a/Doc/library/importlib.resources.rst b/Doc/library/importlib.resources.rst index 437da6927ab2f4..d338ce07a8a4b6 100644 --- a/Doc/library/importlib.resources.rst +++ b/Doc/library/importlib.resources.rst @@ -41,7 +41,7 @@ for example, a package and its resources can be imported from a zip file using ``get_resource_reader(fullname)`` method as specified by :class:`importlib.resources.abc.ResourceReader`. -.. data:: Package +.. class:: Package Whenever a function accepts a ``Package`` argument, you can pass in either a :class:`module object ` or a module name @@ -58,7 +58,7 @@ for example, a package and its resources can be imported from a zip file using containers (think subdirectories). *package* is either a name or a module object which conforms to the - :data:`Package` requirements. + :class:`Package` requirements. .. versionadded:: 3.9 diff --git a/Doc/library/xml.rst b/Doc/library/xml.rst index 1e49b6568dfc28..909022ea4ba6a4 100644 --- a/Doc/library/xml.rst +++ b/Doc/library/xml.rst @@ -73,7 +73,7 @@ decompression bomb Safe Safe Safe 1. Expat 2.4.1 and newer is not vulnerable to the "billion laughs" and "quadratic blowup" vulnerabilities. Items still listed as vulnerable due to potential reliance on system-provided libraries. Check - :const:`pyexpat.EXPAT_VERSION`. + :const:`!pyexpat.EXPAT_VERSION`. 2. :mod:`xml.etree.ElementTree` doesn't expand external entities and raises a :exc:`~xml.etree.ElementTree.ParseError` when an entity occurs. 3. 
:mod:`xml.dom.minidom` doesn't expand external entities and simply returns diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index cb7558ce08e53c..8ae1ab62e69540 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -14,14 +14,12 @@ Doc/c-api/memory.rst Doc/c-api/memoryview.rst Doc/c-api/module.rst Doc/c-api/object.rst -Doc/c-api/set.rst Doc/c-api/stable.rst Doc/c-api/structures.rst Doc/c-api/sys.rst Doc/c-api/type.rst Doc/c-api/typeobj.rst Doc/extending/extending.rst -Doc/extending/newtypes.rst Doc/glossary.rst Doc/howto/descriptor.rst Doc/howto/enum.rst @@ -33,7 +31,6 @@ Doc/library/abc.rst Doc/library/ast.rst Doc/library/asyncio-extending.rst Doc/library/asyncio-policy.rst -Doc/library/asyncio-stream.rst Doc/library/asyncio-subprocess.rst Doc/library/asyncio-task.rst Doc/library/bdb.rst @@ -55,7 +52,6 @@ Doc/library/dis.rst Doc/library/email.charset.rst Doc/library/email.compat32-message.rst Doc/library/email.errors.rst -Doc/library/email.headerregistry.rst Doc/library/email.mime.rst Doc/library/email.parser.rst Doc/library/email.policy.rst @@ -67,12 +63,10 @@ Doc/library/ftplib.rst Doc/library/functions.rst Doc/library/functools.rst Doc/library/gettext.rst -Doc/library/gzip.rst Doc/library/http.client.rst Doc/library/http.cookiejar.rst Doc/library/http.cookies.rst Doc/library/http.server.rst -Doc/library/importlib.resources.rst Doc/library/importlib.rst Doc/library/inspect.rst Doc/library/locale.rst @@ -126,7 +120,6 @@ Doc/library/wsgiref.rst Doc/library/xml.dom.minidom.rst Doc/library/xml.dom.pulldom.rst Doc/library/xml.dom.rst -Doc/library/xml.rst Doc/library/xml.sax.handler.rst Doc/library/xml.sax.reader.rst Doc/library/xml.sax.rst From faa1ef4893f0abc0acd7ce3abd393d06ea69374d Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 27 Nov 2023 10:34:35 +0100 Subject: [PATCH 225/265] [3.11] Docs: fix typo in doc for sqlite3.Cursor.execute (GH-112442) (#112445) Docs: fix typo in doc for sqlite3.Cursor.execute (GH-112442) (cherry picked from commit fb79e1ed4a985a487a02bb8585cc1bd2933dfa7c) Co-authored-by: Tom Levy --- Doc/library/sqlite3.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/sqlite3.rst b/Doc/library/sqlite3.rst index 3e9b7a270ad5ef..36e9092685434a 100644 --- a/Doc/library/sqlite3.rst +++ b/Doc/library/sqlite3.rst @@ -1314,7 +1314,7 @@ Cursor objects .. method:: execute(sql, parameters=(), /) - Execute SQL a single SQL statement, + Execute a single SQL statement, optionally binding Python values using :ref:`placeholders `. From 62e430af9eca7eddafc1f73dd8f7af289ffdf097 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 27 Nov 2023 16:03:45 +0100 Subject: [PATCH 226/265] [3.11] GH-101100: Fix reference warnings for ``socket`` methods (GH-110114) (#112456) Co-authored-by: Adam Turner <9087854+AA-Turner@users.noreply.github.com> Co-authored-by: Serhiy Storchaka --- Doc/library/socket.rst | 24 ++++++++++++------------ Doc/whatsnew/2.0.rst | 6 +++--- Doc/whatsnew/2.7.rst | 4 ++-- 3 files changed, 17 insertions(+), 17 deletions(-) diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index b5c1edaf5ad8e0..9cfa0c4f25e2c8 100644 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -23,7 +23,7 @@ all modern Unix systems, Windows, MacOS, and probably additional platforms. 
The Python interface is a straightforward transliteration of the Unix system call and library interface for sockets to Python's object-oriented style: the -:func:`.socket` function returns a :dfn:`socket object` whose methods implement +:func:`~socket.socket` function returns a :dfn:`socket object` whose methods implement the various socket system calls. Parameter types are somewhat higher-level than in the C interface: as with :meth:`read` and :meth:`write` operations on Python files, buffer allocation on receive operations is automatic, and buffer length @@ -322,7 +322,7 @@ Constants AF_INET6 These constants represent the address (and protocol) families, used for the - first argument to :func:`.socket`. If the :const:`AF_UNIX` constant is not + first argument to :func:`~socket.socket`. If the :const:`AF_UNIX` constant is not defined then this protocol is unsupported. More constants may be available depending on the system. @@ -339,7 +339,7 @@ Constants SOCK_SEQPACKET These constants represent the socket types, used for the second argument to - :func:`.socket`. More constants may be available depending on the system. + :func:`~socket.socket`. More constants may be available depending on the system. (Only :const:`SOCK_STREAM` and :const:`SOCK_DGRAM` appear to be generally useful.) @@ -378,7 +378,7 @@ Constants Many constants of these forms, documented in the Unix documentation on sockets and/or the IP protocol, are also defined in the socket module. They are - generally used in arguments to the :meth:`setsockopt` and :meth:`getsockopt` + generally used in arguments to the :meth:`~socket.setsockopt` and :meth:`~socket.getsockopt` methods of socket objects. In most cases, only those symbols that are defined in the Unix header files are defined; for a few symbols, default values are provided. @@ -671,7 +671,7 @@ The following functions all create :ref:`socket objects `. Build a pair of connected socket objects using the given address family, socket type, and protocol number. Address family, socket type, and protocol number are - as for the :func:`.socket` function above. The default family is :const:`AF_UNIX` + as for the :func:`~socket.socket` function above. The default family is :const:`AF_UNIX` if defined on the platform; otherwise, the default is :const:`AF_INET`. The newly created sockets are :ref:`non-inheritable `. @@ -767,7 +767,7 @@ The following functions all create :ref:`socket objects `. Duplicate the file descriptor *fd* (an integer as returned by a file object's :meth:`~io.IOBase.fileno` method) and build a socket object from the result. Address - family, socket type and protocol number are as for the :func:`.socket` function + family, socket type and protocol number are as for the :func:`~socket.socket` function above. The file descriptor should refer to a socket, but this is not checked --- subsequent operations on the object may fail if the file descriptor is invalid. This function is rarely needed, but can be used to get or set socket options on @@ -832,7 +832,7 @@ The :mod:`socket` module also offers various network-related services: ``(family, type, proto, canonname, sockaddr)`` In these tuples, *family*, *type*, *proto* are all integers and are - meant to be passed to the :func:`.socket` function. *canonname* will be + meant to be passed to the :func:`~socket.socket` function. *canonname* will be a string representing the canonical name of the *host* if :const:`AI_CANONNAME` is part of the *flags* argument; else *canonname* will be empty. 
*sockaddr* is a tuple describing a socket address, whose @@ -948,7 +948,7 @@ The :mod:`socket` module also offers various network-related services: .. function:: getprotobyname(protocolname) Translate an internet protocol name (for example, ``'icmp'``) to a constant - suitable for passing as the (optional) third argument to the :func:`.socket` + suitable for passing as the (optional) third argument to the :func:`~socket.socket` function. This is usually only needed for sockets opened in "raw" mode (:const:`SOCK_RAW`); for the normal socket modes, the correct protocol is chosen automatically if the protocol is omitted or zero. @@ -1232,7 +1232,7 @@ The :mod:`socket` module also offers various network-related services: Send the list of file descriptors *fds* over an :const:`AF_UNIX` socket *sock*. The *fds* parameter is a sequence of file descriptors. - Consult :meth:`sendmsg` for the documentation of these parameters. + Consult :meth:`~socket.sendmsg` for the documentation of these parameters. .. availability:: Unix, Windows, not Emscripten, not WASI. @@ -1246,7 +1246,7 @@ The :mod:`socket` module also offers various network-related services: Receive up to *maxfds* file descriptors from an :const:`AF_UNIX` socket *sock*. Return ``(msg, list(fds), flags, addr)``. - Consult :meth:`recvmsg` for the documentation of these parameters. + Consult :meth:`~socket.recvmsg` for the documentation of these parameters. .. availability:: Unix, Windows, not Emscripten, not WASI. @@ -1965,10 +1965,10 @@ Example Here are four minimal example programs using the TCP/IP protocol: a server that echoes all data that it receives back (servicing only one client), and a client -using it. Note that a server must perform the sequence :func:`.socket`, +using it. Note that a server must perform the sequence :func:`~socket.socket`, :meth:`~socket.bind`, :meth:`~socket.listen`, :meth:`~socket.accept` (possibly repeating the :meth:`~socket.accept` to service more than one client), while a -client only needs the sequence :func:`.socket`, :meth:`~socket.connect`. Also +client only needs the sequence :func:`~socket.socket`, :meth:`~socket.connect`. Also note that the server does not :meth:`~socket.sendall`/:meth:`~socket.recv` on the socket it is listening on but on the new socket returned by :meth:`~socket.accept`. diff --git a/Doc/whatsnew/2.0.rst b/Doc/whatsnew/2.0.rst index 87cd09bd7d523b..fe7efe8ca01ccd 100644 --- a/Doc/whatsnew/2.0.rst +++ b/Doc/whatsnew/2.0.rst @@ -671,9 +671,9 @@ errors. If you absolutely must use 2.0 but can't fix your code, you can edit ``NO_STRICT_LIST_APPEND`` to preserve the old behaviour; this isn't recommended. Some of the functions in the :mod:`socket` module are still forgiving in this -way. For example, :func:`socket.connect( ('hostname', 25) )` is the correct -form, passing a tuple representing an IP address, but :func:`socket.connect( -'hostname', 25 )` also works. :func:`socket.connect_ex` and :func:`socket.bind` +way. For example, ``socket.connect( ('hostname', 25) )`` is the correct +form, passing a tuple representing an IP address, but ``socket.connect('hostname', 25)`` +also works. :meth:`socket.connect_ex ` and :meth:`socket.bind ` are similarly easy-going. 2.0alpha1 tightened these functions up, but because the documentation actually used the erroneous multiple argument form, many people wrote code which would break with the stricter checking. 
GvR backed out diff --git a/Doc/whatsnew/2.7.rst b/Doc/whatsnew/2.7.rst index 5979d77da5d6b6..b9d0ab08e6b724 100644 --- a/Doc/whatsnew/2.7.rst +++ b/Doc/whatsnew/2.7.rst @@ -2382,8 +2382,8 @@ Port-Specific Changes: Mac OS X Port-Specific Changes: FreeBSD ----------------------------------- -* FreeBSD 7.1's :const:`SO_SETFIB` constant, used with - :func:`~socket.getsockopt`/:func:`~socket.setsockopt` to select an +* FreeBSD 7.1's :const:`SO_SETFIB` constant, used with the :func:`~socket.socket` methods + :func:`~socket.socket.getsockopt`/:func:`~socket.socket.setsockopt` to select an alternate routing table, is now available in the :mod:`socket` module. (Added by Kyle VanderBeek; :issue:`8235`.) From 6d9b1819b84a0285ed706d11b1f7060ddf141355 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Mon, 27 Nov 2023 19:41:05 +0200 Subject: [PATCH 227/265] [3.11] gh-84443: SSLSocket.recv_into() now support buffer protocol with itemsize != 1 (GH-20310) (GH-112459) It is also no longer use __len__(). (cherry picked from commit 812360fddda86d7aff5823f529ab720f57ddc411) Co-authored-by: Zackery Spytz --- Lib/ssl.py | 12 ++++++---- Lib/test/test_ssl.py | 22 +++++++++++++++++++ .../2020-05-21-23-32-46.bpo-40262.z4fQv1.rst | 2 ++ 3 files changed, 32 insertions(+), 4 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst diff --git a/Lib/ssl.py b/Lib/ssl.py index 48d229f810af28..c0350a81b2f0eb 100644 --- a/Lib/ssl.py +++ b/Lib/ssl.py @@ -1299,10 +1299,14 @@ def recv(self, buflen=1024, flags=0): def recv_into(self, buffer, nbytes=None, flags=0): self._checkClosed() - if buffer and (nbytes is None): - nbytes = len(buffer) - elif nbytes is None: - nbytes = 1024 + if nbytes is None: + if buffer is not None: + with memoryview(buffer) as view: + nbytes = view.nbytes + if not nbytes: + nbytes = 1024 + else: + nbytes = 1024 if self._sslobj is not None: if flags != 0: raise ValueError( diff --git a/Lib/test/test_ssl.py b/Lib/test/test_ssl.py index b353f3265cbb9e..1f881038c329b5 100644 --- a/Lib/test/test_ssl.py +++ b/Lib/test/test_ssl.py @@ -9,6 +9,7 @@ from test.support import socket_helper from test.support import threading_helper from test.support import warnings_helper +import array import re import socket import select @@ -3765,6 +3766,27 @@ def test_recv_zero(self): self.assertEqual(s.recv(0), b"") self.assertEqual(s.recv_into(bytearray()), 0) + def test_recv_into_buffer_protocol_len(self): + server = ThreadedEchoServer(CERTFILE) + self.enterContext(server) + s = socket.create_connection((HOST, server.port)) + self.addCleanup(s.close) + s = test_wrap_socket(s, suppress_ragged_eofs=False) + self.addCleanup(s.close) + + s.send(b"data") + buf = array.array('I', [0, 0]) + self.assertEqual(s.recv_into(buf), 4) + self.assertEqual(bytes(buf)[:4], b"data") + + class B(bytearray): + def __len__(self): + 1/0 + s.send(b"data") + buf = B(6) + self.assertEqual(s.recv_into(buf), 4) + self.assertEqual(bytes(buf), b"data\0\0") + def test_nonblocking_send(self): server = ThreadedEchoServer(CERTFILE, certreqs=ssl.CERT_NONE, diff --git a/Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst b/Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst new file mode 100644 index 00000000000000..c017a1c8df09d8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst @@ -0,0 +1,2 @@ +The :meth:`ssl.SSLSocket.recv_into` method no longer requires the *buffer* +argument to implement ``__len__`` and supports buffers with arbitrary item size. 
From 581b24415525f10ca178d5537e9cbd7558ce938a Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Mon, 27 Nov 2023 20:11:09 +0200 Subject: [PATCH 228/265] [3.11] gh-112438: Fix support of format units with the "e" prefix in nested tuples in PyArg_Parse (gh-112439) (GH-112461) (cherry picked from commit 4eea1e82369fbf7a795d1956e7a8212a1b58009f) --- Lib/test/test_capi/test_getargs.py | 27 +++++++++++++++++++ ...-11-27-09-44-16.gh-issue-112438.GdNZiI.rst | 2 ++ Python/getargs.c | 2 +- 3 files changed, 30 insertions(+), 1 deletion(-) create mode 100644 Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst diff --git a/Lib/test/test_capi/test_getargs.py b/Lib/test/test_capi/test_getargs.py index 27675533d1bb4b..1ce1ec3b41c954 100644 --- a/Lib/test/test_capi/test_getargs.py +++ b/Lib/test/test_capi/test_getargs.py @@ -1322,6 +1322,33 @@ def test_positional_only(self): with self.assertRaisesRegex(SystemError, 'Empty keyword'): parse((1,), {}, 'O|OO', ['', 'a', '']) + def test_nested_tuple(self): + parse = _testcapi.parse_tuple_and_keywords + + parse(((1, 2, 3),), {}, '(OOO)', ['a']) + parse((1, (2, 3), 4), {}, 'O(OO)O', ['a', 'b', 'c']) + parse(((1, 2, 3),), {}, '(iii)', ['a']) + + with self.assertRaisesRegex(TypeError, + "argument 1 must be sequence of length 2, not 3"): + parse(((1, 2, 3),), {}, '(ii)', ['a']) + with self.assertRaisesRegex(TypeError, + "argument 1 must be sequence of length 2, not 1"): + parse(((1,),), {}, '(ii)', ['a']) + with self.assertRaisesRegex(TypeError, + "argument 1 must be 2-item sequence, not int"): + parse((1,), {}, '(ii)', ['a']) + with self.assertRaisesRegex(TypeError, + "argument 1 must be 2-item sequence, not bytes"): + parse((b'ab',), {}, '(ii)', ['a']) + + for f in 'es', 'et', 'es#', 'et#': + with self.assertRaises(LookupError): # empty encoding "" + parse((('a',),), {}, '(' + f + ')', ['a']) + with self.assertRaisesRegex(TypeError, + "argument 1 must be sequence of length 1, not 0"): + parse(((),), {}, '(' + f + ')', ['a']) + class Test_testcapi(unittest.TestCase): locals().update((name, getattr(_testcapi, name)) diff --git a/Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst b/Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst new file mode 100644 index 00000000000000..113119efd6aebb --- /dev/null +++ b/Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst @@ -0,0 +1,2 @@ +Fix support of format units "es", "et", "es#", and "et#" in nested tuples in +:c:func:`PyArg_ParseTuple`-like functions. 
diff --git a/Python/getargs.c b/Python/getargs.c index 3105bd556c1bf1..e18d7719929161 100644 --- a/Python/getargs.c +++ b/Python/getargs.c @@ -522,7 +522,7 @@ converttuple(PyObject *arg, const char **p_format, va_list *p_va, int flags, } else if (c == ':' || c == ';' || c == '\0') break; - else if (level == 0 && Py_ISALPHA(c)) + else if (level == 0 && Py_ISALPHA(c) && c != 'e') n++; } From 054d18e883c23d7bca71714c0573c7f32f3a760b Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Mon, 27 Nov 2023 20:46:34 +0200 Subject: [PATCH 229/265] [3.11] bpo-41422: Visit the Pickler's and Unpickler's memo in tp_traverse (GH-21664) (GH-112465) (cherry picked from commit 967f2a3052c2d22e31564b428a9aa8cc63dc2a9f) Co-authored-by: kale-smoothie <34165060+kale-smoothie@users.noreply.github.com> --- .../2020-07-28-20-48-05.bpo-41422.iMwnMu.rst | 2 ++ Modules/_pickle.c | 14 ++++++++++++++ 2 files changed, 16 insertions(+) create mode 100644 Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst diff --git a/Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst b/Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst new file mode 100644 index 00000000000000..8bde68f8f2afc8 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst @@ -0,0 +1,2 @@ +Fixed memory leaks of :class:`pickle.Pickler` and :class:`pickle.Unpickler` involving cyclic references via the +internal memo mapping. diff --git a/Modules/_pickle.c b/Modules/_pickle.c index 840877e2db643b..f0cb302184e00b 100644 --- a/Modules/_pickle.c +++ b/Modules/_pickle.c @@ -4691,6 +4691,13 @@ Pickler_traverse(PicklerObject *self, visitproc visit, void *arg) Py_VISIT(self->fast_memo); Py_VISIT(self->reducer_override); Py_VISIT(self->buffer_callback); + PyMemoTable *memo = self->memo; + if (memo && memo->mt_table) { + Py_ssize_t i = memo->mt_allocated; + while (--i >= 0) { + Py_VISIT(memo->mt_table[i].me_key); + } + } return 0; } @@ -7215,6 +7222,13 @@ Unpickler_traverse(UnpicklerObject *self, visitproc visit, void *arg) Py_VISIT(self->stack); Py_VISIT(self->pers_func); Py_VISIT(self->buffers); + PyObject **memo = self->memo; + if (memo) { + Py_ssize_t i = self->memo_size; + while (--i >= 0) { + Py_VISIT(memo[i]); + } + } return 0; } From 43b081bfc49173405576b8eba09a6ca86aac641b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 27 Nov 2023 19:56:27 +0100 Subject: [PATCH 230/265] [3.11] gh-112388: Fix an error that was causing the parser to try to overwrite tokenizer errors (GH-112410) (#112467) gh-112388: Fix an error that was causing the parser to try to overwrite tokenizer errors (GH-112410) (cherry picked from commit 2c8b19174274c183eb652932871f60570123fe99) Signed-off-by: Pablo Galindo Co-authored-by: Pablo Galindo Salgado --- Lib/test/test_syntax.py | 1 + .../2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst | 2 ++ Parser/pegen_errors.c | 4 ++++ 3 files changed, 7 insertions(+) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index 07049087cdacc0..bbd22eccb3e5b1 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -2152,6 +2152,7 @@ def test_error_string_literal(self): def test_invisible_characters(self): self._check_error('print\x17("Hello")', "invalid non-printable character") + self._check_error(b"with(0,,):\n\x01", "invalid non-printable character") def 
test_match_call_does_not_raise_syntax_error(self): code = """ diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst b/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst new file mode 100644 index 00000000000000..1c82be2febda4f --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst @@ -0,0 +1,2 @@ +Fix an error that was causing the parser to try to overwrite tokenizer +errors. Patch by pablo Galindo diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index 005356110a4340..29d5a6e179f307 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -212,6 +212,10 @@ _PyPegen_tokenize_full_source_to_check_for_errors(Parser *p) { void * _PyPegen_raise_error(Parser *p, PyObject *errtype, const char *errmsg, ...) { + // Bail out if we already have an error set. + if (p->error_indicator && PyErr_Occurred()) { + return NULL; + } if (p->fill == 0) { va_list va; va_start(va, errmsg); From 390a5b81a97a36d5166a62e7850fadfe9eba3cc3 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 27 Nov 2023 20:05:20 +0100 Subject: [PATCH 231/265] [3.11] gh-112387: Fix error positions for decoded strings with backwards tokenize errors (GH-112409) (#112469) gh-112387: Fix error positions for decoded strings with backwards tokenize errors (GH-112409) (cherry picked from commit 45d648597b1146431bf3d91041e60d7f040e70bf) Signed-off-by: Pablo Galindo Co-authored-by: Pablo Galindo Salgado --- Lib/test/test_syntax.py | 4 ++++ .../2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst | 2 ++ Parser/pegen_errors.c | 4 ++++ 3 files changed, 10 insertions(+) create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst diff --git a/Lib/test/test_syntax.py b/Lib/test/test_syntax.py index bbd22eccb3e5b1..88d117f641816d 100644 --- a/Lib/test/test_syntax.py +++ b/Lib/test/test_syntax.py @@ -2143,6 +2143,10 @@ def test_error_parenthesis(self): """ self._check_error(code, "parenthesis '\\)' does not match opening parenthesis '\\['") + # Examples with dencodings + s = b'# coding=latin\n(aaaaaaaaaaaaaaaaa\naaaaaaaaaaa\xb5' + self._check_error(s, "'\(' was never closed") + def test_error_string_literal(self): self._check_error("'blech", "unterminated string literal") diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst b/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst new file mode 100644 index 00000000000000..adac11bf4c90a1 --- /dev/null +++ b/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst @@ -0,0 +1,2 @@ +Fix error positions for decoded strings with backwards tokenize errors. +Patch by Pablo Galindo diff --git a/Parser/pegen_errors.c b/Parser/pegen_errors.c index 29d5a6e179f307..fb9fa2909717f9 100644 --- a/Parser/pegen_errors.c +++ b/Parser/pegen_errors.c @@ -270,6 +270,10 @@ get_error_line_from_tokenizer_buffers(Parser *p, Py_ssize_t lineno) Py_ssize_t relative_lineno = p->starting_lineno ? lineno - p->starting_lineno + 1 : lineno; const char* buf_end = p->tok->fp_interactive ? 
p->tok->interactive_src_end : p->tok->inp; + if (buf_end < cur_line) { + buf_end = cur_line + strlen(cur_line); + } + for (int i = 0; i < relative_lineno - 1; i++) { char *new_line = strchr(cur_line, '\n'); // The assert is here for debug builds but the conditional that From c3e5d0d936acb988e547fef2f188fce379694d8c Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 27 Nov 2023 21:14:30 +0100 Subject: [PATCH 232/265] [3.11] gh-68166: Tkinter: Add tests and examples for element_create() (GH-111453) (GH-111858) * Remove mention of "vsapi" element type from the documentation. * Add tests for element_create() and other ttk.Style methods. * Add examples for element_create() in the documentation. (cherry picked from commit 005d1e8fc81539c60c6b21ebba34de3edd5bb232) Co-authored-by: Serhiy Storchaka --- Doc/library/tkinter.ttk.rst | 18 +- Lib/tkinter/test/test_ttk/test_style.py | 184 +++++++++++++++++- ...3-10-27-12-46-56.gh-issue-68166.0EbWW4.rst | 4 + 3 files changed, 203 insertions(+), 3 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst diff --git a/Doc/library/tkinter.ttk.rst b/Doc/library/tkinter.ttk.rst index dc31a1a4c1850a..5fab1454944989 100644 --- a/Doc/library/tkinter.ttk.rst +++ b/Doc/library/tkinter.ttk.rst @@ -1391,8 +1391,7 @@ option. If you don't know the class name of a widget, use the method .. method:: element_create(elementname, etype, *args, **kw) Create a new element in the current theme, of the given *etype* which is - expected to be either "image", "from" or "vsapi". The latter is only - available in Tk 8.6a for Windows XP and Vista and is not described here. + expected to be either "image" or "from". If "image" is used, *args* should contain the default image name followed by statespec/value pairs (this is the imagespec), and *kw* may have the @@ -1418,6 +1417,16 @@ option. If you don't know the class name of a widget, use the method Specifies a minimum width for the element. If less than zero, the base image's width is used as a default. + Example:: + + img1 = tkinter.PhotoImage(master=root, file='button.png') + img1 = tkinter.PhotoImage(master=root, file='button-pressed.png') + img1 = tkinter.PhotoImage(master=root, file='button-active.png') + style = ttk.Style(root) + style.element_create('Button.button', 'image', + img1, ('pressed', img2), ('active', img3), + border=(2, 4), sticky='we') + If "from" is used as the value of *etype*, :meth:`element_create` will clone an existing element. *args* is expected to contain a themename, from which @@ -1425,6 +1434,11 @@ option. If you don't know the class name of a widget, use the method If this element to clone from is not specified, an empty element will be used. *kw* is discarded. + Example:: + + style = ttk.Style(root) + style.element_create('plain.background', 'from', 'default') + .. 
method:: element_names() diff --git a/Lib/tkinter/test/test_ttk/test_style.py b/Lib/tkinter/test/test_ttk/test_style.py index f94adc41f4df8b..edef294a96a7ae 100644 --- a/Lib/tkinter/test/test_ttk/test_style.py +++ b/Lib/tkinter/test/test_ttk/test_style.py @@ -2,6 +2,7 @@ import sys import tkinter from tkinter import ttk +from tkinter import TclError from test import support from test.support import requires from tkinter.test.support import AbstractTkTest, get_tk_patchlevel @@ -122,7 +123,6 @@ def test_theme_use(self): self.style.theme_use(curr_theme) - def test_configure_custom_copy(self): style = self.style @@ -176,6 +176,188 @@ def test_map_custom_copy(self): for key, value in default.items(): self.assertEqual(style.map(newname, key), value) + def test_element_options(self): + style = self.style + element_names = style.element_names() + self.assertNotIsInstance(element_names, str) + for name in element_names: + self.assertIsInstance(name, str) + element_options = style.element_options(name) + self.assertNotIsInstance(element_options, str) + for optname in element_options: + self.assertIsInstance(optname, str) + + def test_element_create_errors(self): + style = self.style + with self.assertRaises(TypeError): + style.element_create('plain.newelem') + with self.assertRaisesRegex(TclError, 'No such element type spam'): + style.element_create('plain.newelem', 'spam') + + def test_element_create_from(self): + style = self.style + style.element_create('plain.background', 'from', 'default') + self.assertIn('plain.background', style.element_names()) + style.element_create('plain.arrow', 'from', 'default', 'rightarrow') + self.assertIn('plain.arrow', style.element_names()) + + def test_element_create_from_errors(self): + style = self.style + with self.assertRaises(IndexError): + style.element_create('plain.newelem', 'from') + with self.assertRaisesRegex(TclError, 'theme "spam" doesn\'t exist'): + style.element_create('plain.newelem', 'from', 'spam') + + def test_element_create_image(self): + style = self.style + image = tkinter.PhotoImage(master=self.root, width=12, height=10) + style.element_create('block', 'image', image) + self.assertIn('block', style.element_names()) + + style.layout('TestLabel1', [('block', {'sticky': 'news'})]) + a = ttk.Label(self.root, style='TestLabel1') + a.pack(expand=True, fill='both') + self.assertEqual(a.winfo_reqwidth(), 12) + self.assertEqual(a.winfo_reqheight(), 10) + + imgfile = support.findfile('python.xbm', subdir='imghdrdata') + img1 = tkinter.BitmapImage(master=self.root, file=imgfile, + foreground='yellow', background='blue') + img2 = tkinter.BitmapImage(master=self.root, file=imgfile, + foreground='blue', background='yellow') + img3 = tkinter.BitmapImage(master=self.root, file=imgfile, + foreground='white', background='black') + style.element_create('Button.button', 'image', + img1, ('pressed', img2), ('active', img3), + border=(2, 4), sticky='we') + self.assertIn('Button.button', style.element_names()) + + style.layout('Button', [('Button.button', {'sticky': 'news'})]) + b = ttk.Button(self.root, style='Button') + b.pack(expand=True, fill='both') + self.assertEqual(b.winfo_reqwidth(), 16) + self.assertEqual(b.winfo_reqheight(), 16) + + def test_element_create_image_errors(self): + style = self.style + image = tkinter.PhotoImage(master=self.root, width=10, height=10) + with self.assertRaises(IndexError): + style.element_create('block2', 'image') + with self.assertRaises(TypeError): + style.element_create('block2', 'image', image, 1) + with 
self.assertRaises(ValueError): + style.element_create('block2', 'image', image, ()) + with self.assertRaisesRegex(TclError, 'Invalid state name'): + style.element_create('block2', 'image', image, ('spam', image)) + with self.assertRaisesRegex(TclError, 'Invalid state name'): + style.element_create('block2', 'image', image, (1, image)) + with self.assertRaises(TypeError): + style.element_create('block2', 'image', image, ('pressed', 1, image)) + with self.assertRaises(TypeError): + style.element_create('block2', 'image', image, (1, 'selected', image)) + with self.assertRaisesRegex(TclError, 'bad option'): + style.element_create('block2', 'image', image, spam=1) + + def test_theme_create(self): + style = self.style + curr_theme = style.theme_use() + curr_layout = style.layout('TLabel') + style.theme_create('testtheme1') + self.assertIn('testtheme1', style.theme_names()) + + style.theme_create('testtheme2', settings={ + 'elem' : {'element create': ['from', 'default'],}, + 'TLabel' : { + 'configure': {'padding': 10}, + 'layout': [('elem', {'sticky': 'we'})], + }, + }) + self.assertIn('testtheme2', style.theme_names()) + + style.theme_create('testtheme3', 'testtheme2') + self.assertIn('testtheme3', style.theme_names()) + + style.theme_use('testtheme1') + self.assertEqual(style.element_names(), ()) + self.assertEqual(style.layout('TLabel'), curr_layout) + + style.theme_use('testtheme2') + self.assertEqual(style.element_names(), ('elem',)) + self.assertEqual(style.lookup('TLabel', 'padding'), '10') + self.assertEqual(style.layout('TLabel'), [('elem', {'sticky': 'we'})]) + + style.theme_use('testtheme3') + self.assertEqual(style.element_names(), ()) + self.assertEqual(style.lookup('TLabel', 'padding'), '') + self.assertEqual(style.layout('TLabel'), [('elem', {'sticky': 'we'})]) + + style.theme_use(curr_theme) + + def test_theme_create_image(self): + style = self.style + curr_theme = style.theme_use() + image = tkinter.PhotoImage(master=self.root, width=10, height=10) + new_theme = 'testtheme4' + style.theme_create(new_theme, settings={ + 'block' : { + 'element create': ['image', image, {'width': 120, 'height': 100}], + }, + 'TestWidget.block2' : { + 'element create': ['image', image], + }, + 'TestWidget' : { + 'configure': { + 'anchor': 'left', + 'padding': (3, 0, 0, 2), + 'foreground': 'yellow', + }, + 'map': { + 'foreground': [ + ('pressed', 'red'), + ('active', 'disabled', 'blue'), + ], + }, + 'layout': [ + ('TestWidget.block', {'sticky': 'we', 'side': 'left'}), + ('TestWidget.border', { + 'sticky': 'nsw', + 'border': 1, + 'children': [ + ('TestWidget.block2', {'sticky': 'nswe'}) + ] + }) + ], + }, + }) + + style.theme_use(new_theme) + self.assertIn('block', style.element_names()) + self.assertEqual(style.lookup('TestWidget', 'anchor'), 'left') + self.assertEqual(style.lookup('TestWidget', 'padding'), '3 0 0 2') + self.assertEqual(style.lookup('TestWidget', 'foreground'), 'yellow') + self.assertEqual(style.lookup('TestWidget', 'foreground', + ['active']), 'yellow') + self.assertEqual(style.lookup('TestWidget', 'foreground', + ['active', 'pressed']), 'red') + self.assertEqual(style.lookup('TestWidget', 'foreground', + ['active', 'disabled']), 'blue') + self.assertEqual(style.layout('TestWidget'), + [ + ('TestWidget.block', {'side': 'left', 'sticky': 'we'}), + ('TestWidget.border', { + 'sticky': 'nsw', + 'border': '1', + 'children': [('TestWidget.block2', {'sticky': 'nswe'})] + }) + ]) + + b = ttk.Label(self.root, style='TestWidget') + b.pack(expand=True, fill='both') + 
self.assertEqual(b.winfo_reqwidth(), 134) + self.assertEqual(b.winfo_reqheight(), 100) + + style.theme_use(curr_theme) + if __name__ == "__main__": unittest.main() diff --git a/Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst b/Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst new file mode 100644 index 00000000000000..757a7004cc1dc0 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst @@ -0,0 +1,4 @@ +Remove mention of not supported "vsapi" element type in +:meth:`tkinter.ttk.Style.element_create`. Add tests for ``element_create()`` +and other ``ttk.Style`` methods. Add examples for ``element_create()`` in +the documentation. From 03b522d3ef8cfc2735134b486bb2581237907b57 Mon Sep 17 00:00:00 2001 From: "Gregory P. Smith" Date: Mon, 27 Nov 2023 15:54:21 -0800 Subject: [PATCH 233/265] [3.11] Backport PR #112477: correct socket AF_PACKET docs (#112478) Backport PR #112477: correct socket AF_PACKET docs Network byte order is not involved in the `int` on the Python side. That happens under the hood. Correctly use the term addresses instead of packets. --- Doc/library/socket.rst | 5 ++--- 1 file changed, 2 insertions(+), 3 deletions(-) diff --git a/Doc/library/socket.rst b/Doc/library/socket.rst index 9cfa0c4f25e2c8..b9f47ae2e6d3f4 100644 --- a/Doc/library/socket.rst +++ b/Doc/library/socket.rst @@ -185,12 +185,11 @@ created. Socket addresses are represented as follows: .. versionadded:: 3.7 - :const:`AF_PACKET` is a low-level interface directly to network devices. - The packets are represented by the tuple + The addresses are represented by the tuple ``(ifname, proto[, pkttype[, hatype[, addr]]])`` where: - *ifname* - String specifying the device name. - - *proto* - An in network-byte-order integer specifying the Ethernet - protocol number. + - *proto* - An integer specifying the Ethernet protocol number. - *pkttype* - Optional integer specifying the packet type: - ``PACKET_HOST`` (the default) - Packet addressed to the local host. From b85070ceaf99acf1b8707764fe5214506a8e7389 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 28 Nov 2023 01:21:18 +0100 Subject: [PATCH 234/265] [3.11] Docs: fix markup for `importlib.machinery.NamespaceLoader` (GH-112479) (#112482) Docs: fix markup for `importlib.machinery.NamespaceLoader` (GH-112479) (cherry picked from commit 2e632fa07d13a58be62f59be4e656ad58b378f9b) Co-authored-by: Alex Waygood --- Doc/library/importlib.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/library/importlib.rst b/Doc/library/importlib.rst index 386a47b05f93da..5978945fcd20f4 100644 --- a/Doc/library/importlib.rst +++ b/Doc/library/importlib.rst @@ -1106,7 +1106,7 @@ find and load modules. .. versionadded:: 3.4 -.. class:: NamespaceLoader(name, path, path_finder): +.. class:: NamespaceLoader(name, path, path_finder) A concrete implementation of :class:`importlib.abc.InspectLoader` for namespace packages. 
This is an alias for a private class and is only made From 42dd2613fe4bc61e1f633078560f2d84a0a16c3f Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 28 Nov 2023 07:39:47 +0100 Subject: [PATCH 235/265] [3.11] gh-112105: Make completer delims work on libedit (gh-112106) (gh-112488) gh-112105: Make completer delims work on libedit (gh-112106) (cherry picked from commit 2df26d83486b8f9ac6b7df2a9a4669508aa61983) Co-authored-by: Tian Gao --- Lib/test/test_readline.py | 20 +++++++++++++++++++ ...-11-15-04-53-37.gh-issue-112105.I3RcVN.rst | 1 + Modules/readline.c | 16 +++++++++++++++ 3 files changed, 37 insertions(+) create mode 100644 Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst diff --git a/Lib/test/test_readline.py b/Lib/test/test_readline.py index 835280f2281cde..6c2726d3209ecf 100644 --- a/Lib/test/test_readline.py +++ b/Lib/test/test_readline.py @@ -5,6 +5,7 @@ import os import sys import tempfile +import textwrap import unittest from test.support import verbose from test.support.import_helper import import_module @@ -163,6 +164,25 @@ def test_auto_history_disabled(self): # end, so don't expect it in the output. self.assertIn(b"History length: 0", output) + def test_set_complete_delims(self): + script = textwrap.dedent(""" + import readline + def complete(text, state): + if state == 0 and text == "$": + return "$complete" + return None + if "libedit" in getattr(readline, "__doc__", ""): + readline.parse_and_bind(r'bind "\\t" rl_complete') + else: + readline.parse_and_bind(r'"\\t": complete') + readline.set_completer_delims(" \\t\\n") + readline.set_completer(complete) + print(input()) + """) + + output = run_pty(script, input=b"$\t\n") + self.assertIn(b"$complete", output) + def test_nonascii(self): loc = locale.setlocale(locale.LC_CTYPE, None) if loc in ('C', 'POSIX'): diff --git a/Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst b/Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst new file mode 100644 index 00000000000000..4243dcb190434f --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst @@ -0,0 +1 @@ +Make :func:`readline.set_completer_delims` work with libedit diff --git a/Modules/readline.c b/Modules/readline.c index 8c7f526d418f82..1e13a0e6e06452 100644 --- a/Modules/readline.c +++ b/Modules/readline.c @@ -572,6 +572,13 @@ readline_set_completer_delims(PyObject *module, PyObject *string) if (break_chars) { free(completer_word_break_characters); completer_word_break_characters = break_chars; +#ifdef WITH_EDITLINE + rl_basic_word_break_characters = break_chars; +#else + if (using_libedit_emulation) { + rl_basic_word_break_characters = break_chars; + } +#endif rl_completer_word_break_characters = break_chars; Py_RETURN_NONE; } @@ -1260,6 +1267,15 @@ setup_readline(readlinestate *mod_state) completer_word_break_characters = strdup(" \t\n`~!@#$%^&*()-=+[{]}\\|;:'\",<>/?"); /* All nonalphanums except '.' 
*/ +#ifdef WITH_EDITLINE + // libedit uses rl_basic_word_break_characters instead of + // rl_completer_word_break_characters as complete delimiter + rl_basic_word_break_characters = completer_word_break_characters; +#else + if (using_libedit_emulation) { + rl_basic_word_break_characters = completer_word_break_characters; + } +#endif rl_completer_word_break_characters = completer_word_break_characters; mod_state->begidx = PyLong_FromLong(0L); From 3c0d963379773890d03173915f8bf43c3a81aed1 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Tue, 28 Nov 2023 10:43:38 +0100 Subject: [PATCH 236/265] [3.11] gh-112431: Unconditionally call `hash -r` (GH-112432) (GH-112492) gh-112431: Unconditionally call `hash -r` (GH-112432) The `activate` script calls `hash -r` in two places to make sure the shell picks up the environment changes the script makes. Before that, it checks to see if the shell running the script is bash or zsh. `hash -r` is specified by POSIX and is not exclusive to bash and zsh. This guard prevents the script from calling `hash -r` in other `GH-!/bin/sh`-compatible shells like dash. (cherry picked from commit a194938f33a71e727e53490815bae874eece1460) Co-authored-by: James Morris <6653392+J-M0@users.noreply.github.com> --- Lib/venv/scripts/common/activate | 14 ++++---------- 1 file changed, 4 insertions(+), 10 deletions(-) diff --git a/Lib/venv/scripts/common/activate b/Lib/venv/scripts/common/activate index 6fbc2b8801da04..982da08163b12d 100644 --- a/Lib/venv/scripts/common/activate +++ b/Lib/venv/scripts/common/activate @@ -14,12 +14,9 @@ deactivate () { unset _OLD_VIRTUAL_PYTHONHOME fi - # This should detect bash and zsh, which have a hash command that must - # be called to get it to forget past commands. Without forgetting + # Call hash to forget past commands. Without forgetting # past commands the $PATH changes we made may not be respected - if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then - hash -r 2> /dev/null - fi + hash -r 2> /dev/null if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then PS1="${_OLD_VIRTUAL_PS1:-}" @@ -61,9 +58,6 @@ if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then export VIRTUAL_ENV_PROMPT fi -# This should detect bash and zsh, which have a hash command that must -# be called to get it to forget past commands. Without forgetting +# Call hash to forget past commands. Without forgetting # past commands the $PATH changes we made may not be respected -if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then - hash -r 2> /dev/null -fi +hash -r 2> /dev/null From 0a0cf45f1bae94b605e99b4f6581f120d75ae7ad Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 29 Nov 2023 16:56:20 +0100 Subject: [PATCH 237/265] [3.11] gh-110930: Correct book title by Alan D. Moore (GH-112490) (#112524) Co-authored-by: Hugo van Kemenade --- Doc/library/tkinter.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/Doc/library/tkinter.rst b/Doc/library/tkinter.rst index aa78a0fe93d53d..5703b1fc050b1f 100644 --- a/Doc/library/tkinter.rst +++ b/Doc/library/tkinter.rst @@ -58,8 +58,8 @@ details that are unchanged. * `Modern Tkinter for Busy Python Developers `_ By Mark Roseman. (ISBN 978-1999149567) - * `Python and Tkinter Programming `_ - By Alan Moore. (ISBN 978-1788835886) + * `Python GUI programming with Tkinter `_ + By Alan D. Moore. (ISBN 978-1788835886) * `Programming Python `_ By Mark Lutz; has excellent coverage of Tkinter. 
(ISBN 978-0596158101) From f6edb83fe97f9e0a20a56e18a11a688229d52c76 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Wed, 29 Nov 2023 18:51:00 +0100 Subject: [PATCH 238/265] [3.11] gh-112509: Fix keys being present in both required_keys and optional_keys in TypedDict (GH-112512) (#112531) (cherry picked from commit 403886942376210662610627b01fea6acd77d331) Co-authored-by: Jelle Zijlstra Co-authored-by: Alex Waygood --- Lib/test/test_typing.py | 40 +++++++++++++++++++ Lib/typing.py | 25 +++++++++--- ...-11-28-20-01-33.gh-issue-112509.QtoKed.rst | 3 ++ 3 files changed, 63 insertions(+), 5 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index 12f78b580effaa..f663fbd8679e0e 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -6805,6 +6805,46 @@ class Cat(Animal): 'voice': str, }) + def test_keys_inheritance_with_same_name(self): + class NotTotal(TypedDict, total=False): + a: int + + class Total(NotTotal): + a: int + + self.assertEqual(NotTotal.__required_keys__, frozenset()) + self.assertEqual(NotTotal.__optional_keys__, frozenset(['a'])) + self.assertEqual(Total.__required_keys__, frozenset(['a'])) + self.assertEqual(Total.__optional_keys__, frozenset()) + + class Base(TypedDict): + a: NotRequired[int] + b: Required[int] + + class Child(Base): + a: Required[int] + b: NotRequired[int] + + self.assertEqual(Base.__required_keys__, frozenset(['b'])) + self.assertEqual(Base.__optional_keys__, frozenset(['a'])) + self.assertEqual(Child.__required_keys__, frozenset(['a'])) + self.assertEqual(Child.__optional_keys__, frozenset(['b'])) + + def test_multiple_inheritance_with_same_key(self): + class Base1(TypedDict): + a: NotRequired[int] + + class Base2(TypedDict): + a: Required[str] + + class Child(Base1, Base2): + pass + + # Last base wins + self.assertEqual(Child.__annotations__, {'a': Required[str]}) + self.assertEqual(Child.__required_keys__, frozenset(['a'])) + self.assertEqual(Child.__optional_keys__, frozenset()) + def test_required_notrequired_keys(self): self.assertEqual(NontotalMovie.__required_keys__, frozenset({"title"})) diff --git a/Lib/typing.py b/Lib/typing.py index fa812f9bd2355a..95f855b5a88452 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -2987,8 +2987,14 @@ def __new__(cls, name, bases, ns, total=True): for base in bases: annotations.update(base.__dict__.get('__annotations__', {})) - required_keys.update(base.__dict__.get('__required_keys__', ())) - optional_keys.update(base.__dict__.get('__optional_keys__', ())) + + base_required = base.__dict__.get('__required_keys__', set()) + required_keys |= base_required + optional_keys -= base_required + + base_optional = base.__dict__.get('__optional_keys__', set()) + required_keys -= base_optional + optional_keys |= base_optional annotations.update(own_annotations) for annotation_key, annotation_type in own_annotations.items(): @@ -3000,14 +3006,23 @@ def __new__(cls, name, bases, ns, total=True): annotation_origin = get_origin(annotation_type) if annotation_origin is Required: - required_keys.add(annotation_key) + is_required = True elif annotation_origin is NotRequired: - optional_keys.add(annotation_key) - elif total: + is_required = False + else: + is_required = total + + if is_required: required_keys.add(annotation_key) + optional_keys.discard(annotation_key) else: optional_keys.add(annotation_key) + 
required_keys.discard(annotation_key) + assert required_keys.isdisjoint(optional_keys), ( + f"Required keys overlap with optional keys in {name}:" + f" {required_keys=}, {optional_keys=}" + ) tp_dict.__annotations__ = annotations tp_dict.__required_keys__ = frozenset(required_keys) tp_dict.__optional_keys__ = frozenset(optional_keys) diff --git a/Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst b/Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst new file mode 100644 index 00000000000000..a16d67e7776bcb --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst @@ -0,0 +1,3 @@ +Fix edge cases that could cause a key to be present in both the +``__required_keys__`` and ``__optional_keys__`` attributes of a +:class:`typing.TypedDict`. Patch by Jelle Zijlstra. From 03ac9e6ee1aacb1ab42788c40e8465a6e5c46532 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Thu, 30 Nov 2023 09:29:23 +0100 Subject: [PATCH 239/265] [3.11] IDLE: fix config_key htest (GH-112545) (#112547) Change 'Dialog' to 'Window' in two places to match the name of the config_key class being tested. (cherry picked from commit 81261fa67ff82b03c255733b0d1abbbb8a228187) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/config_key.py | 2 +- Lib/idlelib/idle_test/htest.py | 2 +- 2 files changed, 2 insertions(+), 2 deletions(-) diff --git a/Lib/idlelib/config_key.py b/Lib/idlelib/config_key.py index bb07231cd590b6..e5f67e8d4069ee 100644 --- a/Lib/idlelib/config_key.py +++ b/Lib/idlelib/config_key.py @@ -351,4 +351,4 @@ def cancel(self, event=None): main('idlelib.idle_test.test_config_key', verbosity=2, exit=False) from idlelib.idle_test.htest import run - run(GetKeysDialog) + run(GetKeysWindow) diff --git a/Lib/idlelib/idle_test/htest.py b/Lib/idlelib/idle_test/htest.py index d297f8aa0094ee..595e51d3699f6e 100644 --- a/Lib/idlelib/idle_test/htest.py +++ b/Lib/idlelib/idle_test/htest.py @@ -152,7 +152,7 @@ def _wrapper(parent): # htest # "Best to close editor first." } -GetKeysDialog_spec = { +GetKeysWindow_spec = { 'file': 'config_key', 'kwds': {'title': 'Test keybindings', 'action': 'find-again', From ddf6cfd4564de7242cea1fe940e9396aec6ebaac Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 1 Dec 2023 08:22:07 +0100 Subject: [PATCH 240/265] [3.11] gh-66819: More IDLE htest updates (GH-112574) (#112576) Revise htest.py docstring and move 2 specs to alphabetical position. (cherry picked from commit e44f1940bd6d2ba4a3f8ac4585b3cf4f9cb93e49) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/idle_test/htest.py | 107 ++++++++++++++++----------------- 1 file changed, 52 insertions(+), 55 deletions(-) diff --git a/Lib/idlelib/idle_test/htest.py b/Lib/idlelib/idle_test/htest.py index 595e51d3699f6e..e21ab98d8aab89 100644 --- a/Lib/idlelib/idle_test/htest.py +++ b/Lib/idlelib/idle_test/htest.py @@ -1,38 +1,35 @@ -'''Run human tests of Idle's window, dialog, and popup widgets. - -run(*tests) -Create a master Tk window. Within that, run each callable in tests -after finding the matching test spec in this file. If tests is empty, -run an htest for each spec dict in this file after finding the matching -callable in the module named in the spec. Close the window to skip or -end the test. - -In a tested module, let X be a global name bound to a callable (class -or function) whose .__name__ attribute is also X (the usual situation). 
-The first parameter of X must be 'parent'. When called, the parent -argument will be the root window. X must create a child Toplevel -window (or subclass thereof). The Toplevel may be a test widget or -dialog, in which case the callable is the corresponding class. Or the -Toplevel may contain the widget to be tested or set up a context in -which a test widget is invoked. In this latter case, the callable is a -wrapper function that sets up the Toplevel and other objects. Wrapper -function names, such as _editor_window', should start with '_'. +"""Run human tests of Idle's window, dialog, and popup widgets. + +run(*tests) Create a master Tk() htest window. Within that, run each +callable in tests after finding the matching test spec in this file. If +tests is empty, run an htest for each spec dict in this file after +finding the matching callable in the module named in the spec. Close +the master window to end testing. + +In a tested module, let X be a global name bound to a callable (class or +function) whose .__name__ attribute is also X (the usual situation). The +first parameter of X must be 'parent'. When called, the parent argument +will be the root window. X must create a child Toplevel(parent) window +(or subclass thereof). The Toplevel may be a test widget or dialog, in +which case the callable is the corresponding class. Or the Toplevel may +contain the widget to be tested or set up a context in which a test +widget is invoked. In this latter case, the callable is a wrapper +function that sets up the Toplevel and other objects. Wrapper function +names, such as _editor_window', should start with '_' and be lowercase. End the module with if __name__ == '__main__': - + from idlelib.idle_test.htest import run - run(X) + run(callable) # There could be multiple comma-separated callables. -To have wrapper functions and test invocation code ignored by coveragepy -reports, put '# htest #' on the def statement header line. - -def _wrapper(parent): # htest # - -Also make sure that the 'if __name__' line matches the above. Then have -make sure that .coveragerc includes the following. +To have wrapper functions ignored by coverage reports, tag the def +header like so: "def _wrapper(parent): # htest #". Use the same tag +for htest lines in widget code. Make sure that the 'if __name__' line +matches the above. Then have make sure that .coveragerc includes the +following: [report] exclude_lines = @@ -46,7 +43,7 @@ def _wrapper(parent): # htest # following template, with X.__name__ prepended to '_spec'. When all tests are run, the prefix is use to get X. -_spec = { +callable_spec = { 'file': '', 'kwds': {'title': ''}, 'msg': "" @@ -54,16 +51,16 @@ def _wrapper(parent): # htest # file (no .py): run() imports file.py. kwds: augmented with {'parent':root} and passed to X as **kwds. -title: an example kwd; some widgets need this, delete if not. +title: an example kwd; some widgets need this, delete line if not. msg: master window hints about testing the widget. -Modules and classes not being tested at the moment: -pyshell.PyShellEditorWindow -debugger.Debugger -autocomplete_w.AutoCompleteWindow -outwin.OutputWindow (indirectly being tested with grep test) -''' +TODO test these modules and classes: + autocomplete_w.AutoCompleteWindow + debugger.Debugger + outwin.OutputWindow (indirectly being tested with grep test) + pyshell.PyShellEditorWindow +""" import idlelib.pyshell # Set Windows DPI awareness before Tk(). 
from importlib import import_module @@ -91,15 +88,6 @@ def _wrapper(parent): # htest # "Force-open-calltip does not work here.\n" } -_module_browser_spec = { - 'file': 'browser', - 'kwds': {}, - 'msg': "Inspect names of module, class(with superclass if " - "applicable), methods and functions.\nToggle nested items.\n" - "Double clicking on items prints a traceback for an exception " - "that is ignored." - } - _color_delegator_spec = { 'file': 'colorizer', 'kwds': {}, @@ -109,16 +97,6 @@ def _wrapper(parent): # htest # "The default color scheme is in idlelib/config-highlight.def" } -CustomRun_spec = { - 'file': 'query', - 'kwds': {'title': 'Customize query.py Run', - '_htest': True}, - 'msg': "Enter with or [Run]. Print valid entry to Shell\n" - "Arguments are parsed into a list\n" - "Mode is currently restart True or False\n" - "Close dialog with valid entry, , [Cancel], [X]" - } - ConfigDialog_spec = { 'file': 'configdialog', 'kwds': {'title': 'ConfigDialogTest', @@ -135,6 +113,16 @@ def _wrapper(parent): # htest # "changes made have persisted." } +CustomRun_spec = { + 'file': 'query', + 'kwds': {'title': 'Customize query.py Run', + '_htest': True}, + 'msg': "Enter with or [Run]. Print valid entry to Shell\n" + "Arguments are parsed into a list\n" + "Mode is currently restart True or False\n" + "Close dialog with valid entry, , [Cancel], [X]" + } + # TODO Improve message _dyn_option_menu_spec = { 'file': 'dynoption', @@ -236,6 +224,15 @@ def _wrapper(parent): # htest # "focusing out of the window\nare sequences to be tested." } +_module_browser_spec = { + 'file': 'browser', + 'kwds': {}, + 'msg': "Inspect names of module, class(with superclass if " + "applicable), methods and functions.\nToggle nested items.\n" + "Double clicking on items prints a traceback for an exception " + "that is ignored." 
+ } + _multistatus_bar_spec = { 'file': 'statusbar', 'kwds': {}, From fc11280a028bf225ab9790c8ca3f80509b256510 Mon Sep 17 00:00:00 2001 From: Serhiy Storchaka Date: Fri, 1 Dec 2023 10:17:20 +0200 Subject: [PATCH 241/265] [3.11] gh-104231: Add more tests for str(), repr(), ascii(), and bytes() (GH-112551) (GH-112556) (cherry picked from commit 2223899adce858a09ebeaaf82111e6cda9b42415) --- Lib/test/test_bytes.py | 81 ++++++++++++++++++++++++++---------- Lib/test/test_unicode.py | 90 +++++++++++++++++++++++++++------------- 2 files changed, 121 insertions(+), 50 deletions(-) diff --git a/Lib/test/test_bytes.py b/Lib/test/test_bytes.py index 53ba1ad6f5911f..f5605ca3c2c53b 100644 --- a/Lib/test/test_bytes.py +++ b/Lib/test/test_bytes.py @@ -46,6 +46,10 @@ def __index__(self): class BaseBytesTest: + def assertTypedEqual(self, actual, expected): + self.assertIs(type(actual), type(expected)) + self.assertEqual(actual, expected) + def test_basics(self): b = self.type2test() self.assertEqual(type(b), self.type2test) @@ -1023,36 +1027,63 @@ def test_buffer_is_readonly(self): self.assertRaises(TypeError, f.readinto, b"") def test_custom(self): - class A: - def __bytes__(self): - return b'abc' - self.assertEqual(bytes(A()), b'abc') - class A: pass - self.assertRaises(TypeError, bytes, A()) - class A: - def __bytes__(self): - return None - self.assertRaises(TypeError, bytes, A()) - class A: + self.assertEqual(bytes(BytesSubclass(b'abc')), b'abc') + self.assertEqual(BytesSubclass(OtherBytesSubclass(b'abc')), + BytesSubclass(b'abc')) + self.assertEqual(bytes(WithBytes(b'abc')), b'abc') + self.assertEqual(BytesSubclass(WithBytes(b'abc')), BytesSubclass(b'abc')) + + class NoBytes: pass + self.assertRaises(TypeError, bytes, NoBytes()) + self.assertRaises(TypeError, bytes, WithBytes('abc')) + self.assertRaises(TypeError, bytes, WithBytes(None)) + class IndexWithBytes: def __bytes__(self): return b'a' def __index__(self): return 42 - self.assertEqual(bytes(A()), b'a') + self.assertEqual(bytes(IndexWithBytes()), b'a') # Issue #25766 - class A(str): + class StrWithBytes(str): + def __new__(cls, value): + self = str.__new__(cls, '\u20ac') + self.value = value + return self def __bytes__(self): - return b'abc' - self.assertEqual(bytes(A('\u20ac')), b'abc') - self.assertEqual(bytes(A('\u20ac'), 'iso8859-15'), b'\xa4') + return self.value + self.assertEqual(bytes(StrWithBytes(b'abc')), b'abc') + self.assertEqual(bytes(StrWithBytes(b'abc'), 'iso8859-15'), b'\xa4') + self.assertEqual(bytes(StrWithBytes(BytesSubclass(b'abc'))), b'abc') + self.assertEqual(BytesSubclass(StrWithBytes(b'abc')), BytesSubclass(b'abc')) + self.assertEqual(BytesSubclass(StrWithBytes(b'abc'), 'iso8859-15'), + BytesSubclass(b'\xa4')) + self.assertEqual(BytesSubclass(StrWithBytes(BytesSubclass(b'abc'))), + BytesSubclass(b'abc')) + self.assertEqual(BytesSubclass(StrWithBytes(OtherBytesSubclass(b'abc'))), + BytesSubclass(b'abc')) # Issue #24731 - class A: + self.assertTypedEqual(bytes(WithBytes(BytesSubclass(b'abc'))), BytesSubclass(b'abc')) + self.assertTypedEqual(BytesSubclass(WithBytes(BytesSubclass(b'abc'))), + BytesSubclass(b'abc')) + self.assertTypedEqual(BytesSubclass(WithBytes(OtherBytesSubclass(b'abc'))), + BytesSubclass(b'abc')) + + class BytesWithBytes(bytes): + def __new__(cls, value): + self = bytes.__new__(cls, b'\xa4') + self.value = value + return self def __bytes__(self): - return OtherBytesSubclass(b'abc') - self.assertEqual(bytes(A()), b'abc') - self.assertIs(type(bytes(A())), OtherBytesSubclass) - 
self.assertEqual(BytesSubclass(A()), b'abc') - self.assertIs(type(BytesSubclass(A())), BytesSubclass) + return self.value + self.assertTypedEqual(bytes(BytesWithBytes(b'abc')), b'abc') + self.assertTypedEqual(BytesSubclass(BytesWithBytes(b'abc')), + BytesSubclass(b'abc')) + self.assertTypedEqual(bytes(BytesWithBytes(BytesSubclass(b'abc'))), + BytesSubclass(b'abc')) + self.assertTypedEqual(BytesSubclass(BytesWithBytes(BytesSubclass(b'abc'))), + BytesSubclass(b'abc')) + self.assertTypedEqual(BytesSubclass(BytesWithBytes(OtherBytesSubclass(b'abc'))), + BytesSubclass(b'abc')) # Test PyBytes_FromFormat() def test_from_format(self): @@ -2040,6 +2071,12 @@ class BytesSubclass(bytes): class OtherBytesSubclass(bytes): pass +class WithBytes: + def __init__(self, value): + self.value = value + def __bytes__(self): + return self.value + class ByteArraySubclassTest(SubclassTest, unittest.TestCase): basetype = bytearray type2test = ByteArraySubclass diff --git a/Lib/test/test_unicode.py b/Lib/test/test_unicode.py index c265438a86d516..985b3661328781 100644 --- a/Lib/test/test_unicode.py +++ b/Lib/test/test_unicode.py @@ -55,6 +55,21 @@ def duplicate_string(text): class StrSubclass(str): pass +class OtherStrSubclass(str): + pass + +class WithStr: + def __init__(self, value): + self.value = value + def __str__(self): + return self.value + +class WithRepr: + def __init__(self, value): + self.value = value + def __repr__(self): + return self.value + class UnicodeTest(string_tests.CommonTest, string_tests.MixinStrUnicodeUserStringTest, string_tests.MixinStrUnicodeTest, @@ -84,6 +99,10 @@ def __repr__(self): self.assertEqual(realresult, result) self.assertTrue(object is not realresult) + def assertTypedEqual(self, actual, expected): + self.assertIs(type(actual), type(expected)) + self.assertEqual(actual, expected) + def test_literals(self): self.assertEqual('\xff', '\u00ff') self.assertEqual('\uffff', '\U0000ffff') @@ -130,10 +149,13 @@ def test_ascii(self): self.assertEqual(ascii("\U00010000" * 39 + "\uffff" * 4096), ascii("\U00010000" * 39 + "\uffff" * 4096)) - class WrongRepr: - def __repr__(self): - return b'byte-repr' - self.assertRaises(TypeError, ascii, WrongRepr()) + self.assertTypedEqual(ascii('\U0001f40d'), r"'\U0001f40d'") + self.assertTypedEqual(ascii(StrSubclass('abc')), "'abc'") + self.assertTypedEqual(ascii(WithRepr('')), '') + self.assertTypedEqual(ascii(WithRepr(StrSubclass(''))), StrSubclass('')) + self.assertTypedEqual(ascii(WithRepr('<\U0001f40d>')), r'<\U0001f40d>') + self.assertTypedEqual(ascii(WithRepr(StrSubclass('<\U0001f40d>'))), r'<\U0001f40d>') + self.assertRaises(TypeError, ascii, WithRepr(b'byte-repr')) def test_repr(self): if not sys.platform.startswith('java'): @@ -172,10 +194,13 @@ def test_repr(self): self.assertEqual(repr("\U00010000" * 39 + "\uffff" * 4096), repr("\U00010000" * 39 + "\uffff" * 4096)) - class WrongRepr: - def __repr__(self): - return b'byte-repr' - self.assertRaises(TypeError, repr, WrongRepr()) + self.assertTypedEqual(repr('\U0001f40d'), "'\U0001f40d'") + self.assertTypedEqual(repr(StrSubclass('abc')), "'abc'") + self.assertTypedEqual(repr(WithRepr('')), '') + self.assertTypedEqual(repr(WithRepr(StrSubclass(''))), StrSubclass('')) + self.assertTypedEqual(repr(WithRepr('<\U0001f40d>')), '<\U0001f40d>') + self.assertTypedEqual(repr(WithRepr(StrSubclass('<\U0001f40d>'))), StrSubclass('<\U0001f40d>')) + self.assertRaises(TypeError, repr, WithRepr(b'byte-repr')) def test_iterators(self): # Make sure unicode objects have an __iter__ method @@ -2381,28 +2406,37 @@ 
def test_ucs4(self): def test_conversion(self): # Make sure __str__() works properly - class ObjectToStr: - def __str__(self): - return "foo" - - class StrSubclassToStr(str): - def __str__(self): - return "foo" - - class StrSubclassToStrSubclass(str): - def __new__(cls, content=""): - return str.__new__(cls, 2*content) - def __str__(self): + class StrWithStr(str): + def __new__(cls, value): + self = str.__new__(cls, "") + self.value = value return self + def __str__(self): + return self.value - self.assertEqual(str(ObjectToStr()), "foo") - self.assertEqual(str(StrSubclassToStr("bar")), "foo") - s = str(StrSubclassToStrSubclass("foo")) - self.assertEqual(s, "foofoo") - self.assertIs(type(s), StrSubclassToStrSubclass) - s = StrSubclass(StrSubclassToStrSubclass("foo")) - self.assertEqual(s, "foofoo") - self.assertIs(type(s), StrSubclass) + self.assertTypedEqual(str(WithStr('abc')), 'abc') + self.assertTypedEqual(str(WithStr(StrSubclass('abc'))), StrSubclass('abc')) + self.assertTypedEqual(StrSubclass(WithStr('abc')), StrSubclass('abc')) + self.assertTypedEqual(StrSubclass(WithStr(StrSubclass('abc'))), + StrSubclass('abc')) + self.assertTypedEqual(StrSubclass(WithStr(OtherStrSubclass('abc'))), + StrSubclass('abc')) + + self.assertTypedEqual(str(StrWithStr('abc')), 'abc') + self.assertTypedEqual(str(StrWithStr(StrSubclass('abc'))), StrSubclass('abc')) + self.assertTypedEqual(StrSubclass(StrWithStr('abc')), StrSubclass('abc')) + self.assertTypedEqual(StrSubclass(StrWithStr(StrSubclass('abc'))), + StrSubclass('abc')) + self.assertTypedEqual(StrSubclass(StrWithStr(OtherStrSubclass('abc'))), + StrSubclass('abc')) + + self.assertTypedEqual(str(WithRepr('')), '') + self.assertTypedEqual(str(WithRepr(StrSubclass(''))), StrSubclass('')) + self.assertTypedEqual(StrSubclass(WithRepr('')), StrSubclass('')) + self.assertTypedEqual(StrSubclass(WithRepr(StrSubclass(''))), + StrSubclass('')) + self.assertTypedEqual(StrSubclass(WithRepr(OtherStrSubclass(''))), + StrSubclass('')) def test_unicode_repr(self): class s1: From d838c7aa8aa85e8a470cf11211bcbe59897e32d2 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 1 Dec 2023 15:18:09 +0100 Subject: [PATCH 242/265] [3.11] gh-82565: Add tests for pickle and unpickle with bad files (GH-16606) (GH-112592) (cherry picked from commit 058444308abee79bb1b3358883adfa8c97bd043a) Co-authored-by: Zackery Spytz --- Lib/test/pickletester.py | 78 ++++++++++++++++++++++++++++++++++++++++ 1 file changed, 78 insertions(+) diff --git a/Lib/test/pickletester.py b/Lib/test/pickletester.py index a687fe0629080a..5d5df1658c4ee2 100644 --- a/Lib/test/pickletester.py +++ b/Lib/test/pickletester.py @@ -3483,6 +3483,84 @@ def __init__(self): pass self.assertRaises(pickle.PicklingError, BadPickler().dump, 0) self.assertRaises(pickle.UnpicklingError, BadUnpickler().load) + def test_unpickler_bad_file(self): + # bpo-38384: Crash in _pickle if the read attribute raises an error. 
+ def raises_oserror(self, *args, **kwargs): + raise OSError + @property + def bad_property(self): + 1/0 + + # File without read and readline + class F: + pass + self.assertRaises((AttributeError, TypeError), self.Unpickler, F()) + + # File without read + class F: + readline = raises_oserror + self.assertRaises((AttributeError, TypeError), self.Unpickler, F()) + + # File without readline + class F: + read = raises_oserror + self.assertRaises((AttributeError, TypeError), self.Unpickler, F()) + + # File with bad read + class F: + read = bad_property + readline = raises_oserror + self.assertRaises(ZeroDivisionError, self.Unpickler, F()) + + # File with bad readline + class F: + readline = bad_property + read = raises_oserror + self.assertRaises(ZeroDivisionError, self.Unpickler, F()) + + # File with bad readline, no read + class F: + readline = bad_property + self.assertRaises(ZeroDivisionError, self.Unpickler, F()) + + # File with bad read, no readline + class F: + read = bad_property + self.assertRaises((AttributeError, ZeroDivisionError), self.Unpickler, F()) + + # File with bad peek + class F: + peek = bad_property + read = raises_oserror + readline = raises_oserror + try: + self.Unpickler(F()) + except ZeroDivisionError: + pass + + # File with bad readinto + class F: + readinto = bad_property + read = raises_oserror + readline = raises_oserror + try: + self.Unpickler(F()) + except ZeroDivisionError: + pass + + def test_pickler_bad_file(self): + # File without write + class F: + pass + self.assertRaises(TypeError, self.Pickler, F()) + + # File with bad write + class F: + @property + def write(self): + 1/0 + self.assertRaises(ZeroDivisionError, self.Pickler, F()) + def check_dumps_loads_oob_buffers(self, dumps, loads): # No need to do the full gamut of tests here, just enough to # check that dumps() and loads() redirect their arguments From eb3c0dc66760d24fc64eaddcc705519ed621f84b Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 1 Dec 2023 15:43:09 +0100 Subject: [PATCH 243/265] [3.11] gh-109413: regrtest: add WorkerRunTests class (GH-112588) (#112594) gh-109413: regrtest: add WorkerRunTests class (GH-112588) (cherry picked from commit f8ff80f63536e96b004d29112452a8f1738fde37) Co-authored-by: Victor Stinner --- Lib/test/libregrtest/main.py | 1 - Lib/test/libregrtest/run_workers.py | 12 +++++------ Lib/test/libregrtest/runtests.py | 31 +++++++++++++++++++---------- Lib/test/libregrtest/worker.py | 6 +++--- 4 files changed, 28 insertions(+), 22 deletions(-) diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py index 8544bb484c8be0..beee0fa09500e1 100644 --- a/Lib/test/libregrtest/main.py +++ b/Lib/test/libregrtest/main.py @@ -418,7 +418,6 @@ def create_run_tests(self, tests: TestTuple): python_cmd=self.python_cmd, randomize=self.randomize, random_seed=self.random_seed, - json_file=None, ) def _run_tests(self, selected: TestTuple, tests: TestList | None) -> int: diff --git a/Lib/test/libregrtest/run_workers.py b/Lib/test/libregrtest/run_workers.py index ab03cb54d6122e..25ca8623c687c5 100644 --- a/Lib/test/libregrtest/run_workers.py +++ b/Lib/test/libregrtest/run_workers.py @@ -18,7 +18,7 @@ from .logger import Logger from .result import TestResult, State from .results import TestResults -from .runtests import RunTests, JsonFile, JsonFileType +from .runtests import RunTests, WorkerRunTests, JsonFile, JsonFileType from .single import PROGRESS_MIN_TIME from .utils import ( StrPath, TestName, @@ -162,7 +162,7 @@ 
def stop(self) -> None: self._stopped = True self._kill() - def _run_process(self, runtests: RunTests, output_fd: int, + def _run_process(self, runtests: WorkerRunTests, output_fd: int, tmp_dir: StrPath | None = None) -> int | None: popen = create_worker_process(runtests, output_fd, tmp_dir) self._popen = popen @@ -250,9 +250,7 @@ def create_json_file(self, stack: contextlib.ExitStack) -> tuple[JsonFile, TextI json_file = JsonFile(json_fd, JsonFileType.UNIX_FD) return (json_file, json_tmpfile) - def create_worker_runtests(self, test_name: TestName, json_file: JsonFile) -> RunTests: - """Create the worker RunTests.""" - + def create_worker_runtests(self, test_name: TestName, json_file: JsonFile) -> WorkerRunTests: tests = (test_name,) if self.runtests.rerun: match_tests = self.runtests.get_match_tests(test_name) @@ -265,12 +263,12 @@ def create_worker_runtests(self, test_name: TestName, json_file: JsonFile) -> Ru if self.runtests.output_on_failure: kwargs['verbose'] = True kwargs['output_on_failure'] = False - return self.runtests.copy( + return self.runtests.create_worker_runtests( tests=tests, json_file=json_file, **kwargs) - def run_tmp_files(self, worker_runtests: RunTests, + def run_tmp_files(self, worker_runtests: WorkerRunTests, stdout_fd: int) -> tuple[int | None, list[StrPath]]: # gh-93353: Check for leaked temporary files in the parent process, # since the deletion of temporary files can happen late during diff --git a/Lib/test/libregrtest/runtests.py b/Lib/test/libregrtest/runtests.py index bfed1b4a2a5817..283c104c1afb37 100644 --- a/Lib/test/libregrtest/runtests.py +++ b/Lib/test/libregrtest/runtests.py @@ -91,13 +91,17 @@ class RunTests: python_cmd: tuple[str, ...] | None randomize: bool random_seed: int | str - json_file: JsonFile | None - def copy(self, **override): + def copy(self, **override) -> 'RunTests': state = dataclasses.asdict(self) state.update(override) return RunTests(**state) + def create_worker_runtests(self, **override): + state = dataclasses.asdict(self) + state.update(override) + return WorkerRunTests(**state) + def get_match_tests(self, test_name) -> FilterTuple | None: if self.match_tests_dict is not None: return self.match_tests_dict.get(test_name, None) @@ -118,13 +122,6 @@ def iter_tests(self): else: yield from self.tests - def as_json(self) -> StrJSON: - return json.dumps(self, cls=_EncodeRunTests) - - @staticmethod - def from_json(worker_json: StrJSON) -> 'RunTests': - return json.loads(worker_json, object_hook=_decode_runtests) - def json_file_use_stdout(self) -> bool: # Use STDOUT in two cases: # @@ -139,9 +136,21 @@ def json_file_use_stdout(self) -> bool: ) +@dataclasses.dataclass(slots=True, frozen=True) +class WorkerRunTests(RunTests): + json_file: JsonFile + + def as_json(self) -> StrJSON: + return json.dumps(self, cls=_EncodeRunTests) + + @staticmethod + def from_json(worker_json: StrJSON) -> 'WorkerRunTests': + return json.loads(worker_json, object_hook=_decode_runtests) + + class _EncodeRunTests(json.JSONEncoder): def default(self, o: Any) -> dict[str, Any]: - if isinstance(o, RunTests): + if isinstance(o, WorkerRunTests): result = dataclasses.asdict(o) result["__runtests__"] = True return result @@ -156,6 +165,6 @@ def _decode_runtests(data: dict[str, Any]) -> RunTests | dict[str, Any]: data['hunt_refleak'] = HuntRefleak(**data['hunt_refleak']) if data['json_file']: data['json_file'] = JsonFile(**data['json_file']) - return RunTests(**data) + return WorkerRunTests(**data) else: return data diff --git a/Lib/test/libregrtest/worker.py 
b/Lib/test/libregrtest/worker.py index 2eccfabc25223a..581741f2b1e92f 100644 --- a/Lib/test/libregrtest/worker.py +++ b/Lib/test/libregrtest/worker.py @@ -7,7 +7,7 @@ from test.support import os_helper from .setup import setup_process, setup_test_dir -from .runtests import RunTests, JsonFile, JsonFileType +from .runtests import WorkerRunTests, JsonFile, JsonFileType from .single import run_single_test from .utils import ( StrPath, StrJSON, TestFilter, @@ -17,7 +17,7 @@ USE_PROCESS_GROUP = (hasattr(os, "setsid") and hasattr(os, "killpg")) -def create_worker_process(runtests: RunTests, output_fd: int, +def create_worker_process(runtests: WorkerRunTests, output_fd: int, tmp_dir: StrPath | None = None) -> subprocess.Popen: python_cmd = runtests.python_cmd worker_json = runtests.as_json() @@ -71,7 +71,7 @@ def create_worker_process(runtests: RunTests, output_fd: int, def worker_process(worker_json: StrJSON) -> NoReturn: - runtests = RunTests.from_json(worker_json) + runtests = WorkerRunTests.from_json(worker_json) test_name = runtests.tests[0] match_tests: TestFilter = runtests.match_tests json_file: JsonFile = runtests.json_file From 0443f926becc38480fe60529ad9049ccceb635bd Mon Sep 17 00:00:00 2001 From: Alexey Izbyshev Date: Fri, 1 Dec 2023 18:44:03 +0300 Subject: [PATCH 244/265] [3.11] bpo-35191: Fix unexpected integer truncation in socket.setblocking() (GH-10415) On platforms with 64-bit long, socket.setblocking(x) treated all x which lower 32 bits are zero as False due to integer truncation. Reported by ubsan. --- Lib/test/test_socket.py | 4 ++-- .../next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst | 2 ++ Modules/socketmodule.c | 8 +++++--- 3 files changed, 9 insertions(+), 5 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst diff --git a/Lib/test/test_socket.py b/Lib/test/test_socket.py index 3fceac340a461c..f6941e7a6f3771 100644 --- a/Lib/test/test_socket.py +++ b/Lib/test/test_socket.py @@ -4705,10 +4705,10 @@ def testSetBlocking_overflow(self): self.skipTest('needs UINT_MAX < ULONG_MAX') self.serv.setblocking(False) - self.assertEqual(self.serv.gettimeout(), 0.0) + self.assert_sock_timeout(self.serv, 0.0) self.serv.setblocking(_testcapi.UINT_MAX + 1) - self.assertIsNone(self.serv.gettimeout()) + self.assert_sock_timeout(self.serv, None) _testSetBlocking_overflow = support.cpython_only(_testSetBlocking) diff --git a/Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst b/Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst new file mode 100644 index 00000000000000..37f6f56597b6d4 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst @@ -0,0 +1,2 @@ +Fix unexpected integer truncation in :meth:`socket.setblocking` which caused +it to interpret multiples of ``2**32`` as ``False``. diff --git a/Modules/socketmodule.c b/Modules/socketmodule.c index 8d9d3c3e1cd3e3..997df43f20ca79 100644 --- a/Modules/socketmodule.c +++ b/Modules/socketmodule.c @@ -2815,12 +2815,14 @@ For IP sockets, the address info is a pair (hostaddr, port)."); static PyObject * sock_setblocking(PySocketSockObject *s, PyObject *arg) { - long block; + long value; + int block; - block = PyLong_AsLong(arg); - if (block == -1 && PyErr_Occurred()) + value = PyLong_AsLong(arg); + if (value == -1 && PyErr_Occurred()) return NULL; + block = (value != 0); s->sock_timeout = _PyTime_FromSeconds(block ? 
-1 : 0); if (internal_setblocking(s, block) == -1) { return NULL; From da332044304eb236036d3b30297907902d935d7a Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 1 Dec 2023 18:28:09 +0100 Subject: [PATCH 245/265] [3.11] [3.12] gh-109413: libregrtest: enable mypy's `--strict-optional` check on most files (GH-112586) (GH-112602) (#112603) [3.12] gh-109413: libregrtest: enable mypy's `--strict-optional` check on most files (GH-112586) (GH-112602) gh-109413: libregrtest: enable mypy's `--strict-optional` check on most files (GH-112586) (cherry picked from commit 36dbebed44785579034ffb1a59f0a1a5b987c9bf) Co-authored-by: Alex Waygood Co-authored-by: Victor Stinner --- Lib/test/libregrtest/mypy.ini | 33 +++++++++++++++++++++++++++++++++ Lib/test/libregrtest/results.py | 2 ++ Lib/test/libregrtest/single.py | 8 ++++---- Lib/test/libregrtest/utils.py | 9 +++++++++ 4 files changed, 48 insertions(+), 4 deletions(-) create mode 100644 Lib/test/libregrtest/mypy.ini diff --git a/Lib/test/libregrtest/mypy.ini b/Lib/test/libregrtest/mypy.ini new file mode 100644 index 00000000000000..22c7c7a9acef14 --- /dev/null +++ b/Lib/test/libregrtest/mypy.ini @@ -0,0 +1,33 @@ +# Config file for running mypy on libregrtest. +# Run mypy by invoking `mypy --config-file Lib/test/libregrtest/mypy.ini` +# on the command-line from the repo root + +[mypy] +files = Lib/test/libregrtest +explicit_package_bases = True +python_version = 3.12 +platform = linux +pretty = True + +# Enable most stricter settings +enable_error_code = ignore-without-code +strict = True + +# Various stricter settings that we can't yet enable +# Try to enable these in the following order: +disallow_any_generics = False +disallow_incomplete_defs = False +disallow_untyped_calls = False +disallow_untyped_defs = False +check_untyped_defs = False +warn_return_any = False + +disable_error_code = return + +# Enable --strict-optional for these ASAP: +[mypy-Lib.test.libregrtest.main.*,Lib.test.libregrtest.run_workers.*] +strict_optional = False + +# Various internal modules that typeshed deliberately doesn't have stubs for: +[mypy-_abc.*,_opcode.*,_overlapped.*,_testcapi.*,_testinternalcapi.*,test.*] +ignore_missing_imports = True diff --git a/Lib/test/libregrtest/results.py b/Lib/test/libregrtest/results.py index 1feb43f8c074db..ad1744899cadc0 100644 --- a/Lib/test/libregrtest/results.py +++ b/Lib/test/libregrtest/results.py @@ -114,6 +114,8 @@ def accumulate_result(self, result: TestResult, runtests: RunTests): self.worker_bug = True if result.has_meaningful_duration() and not rerun: + if result.duration is None: + raise ValueError("result.duration is None") self.test_times.append((result.duration, test_name)) if result.stats is not None: self.stats.accumulate(result.stats) diff --git a/Lib/test/libregrtest/single.py b/Lib/test/libregrtest/single.py index 5c7bc7d40fb394..eafeb5fe26f3f3 100644 --- a/Lib/test/libregrtest/single.py +++ b/Lib/test/libregrtest/single.py @@ -237,11 +237,11 @@ def _runtest(result: TestResult, runtests: RunTests) -> None: output_on_failure = runtests.output_on_failure timeout = runtests.timeout - use_timeout = ( - timeout is not None and threading_helper.can_start_thread - ) - if use_timeout: + if timeout is not None and threading_helper.can_start_thread: + use_timeout = True faulthandler.dump_traceback_later(timeout, exit=True) + else: + use_timeout = False try: setup_tests(runtests) diff --git a/Lib/test/libregrtest/utils.py b/Lib/test/libregrtest/utils.py index 
25f3e853137316..5eb2d72186b2e6 100644 --- a/Lib/test/libregrtest/utils.py +++ b/Lib/test/libregrtest/utils.py @@ -372,10 +372,19 @@ def get_temp_dir(tmp_dir: StrPath | None = None) -> StrPath: # Python out of the source tree, especially when the # source tree is read only. tmp_dir = sysconfig.get_config_var('srcdir') + if not tmp_dir: + raise RuntimeError( + "Could not determine the correct value for tmp_dir" + ) tmp_dir = os.path.join(tmp_dir, 'build') else: # WASI platform tmp_dir = sysconfig.get_config_var('projectbase') + if not tmp_dir: + raise RuntimeError( + "sysconfig.get_config_var('projectbase') " + f"unexpectedly returned {tmp_dir!r} on WASI" + ) tmp_dir = os.path.join(tmp_dir, 'build') # When get_temp_dir() is called in a worker process, From 720f5bf1ac24b618653157228e97f74298160e01 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Fri, 1 Dec 2023 20:09:00 +0100 Subject: [PATCH 246/265] [3.11] [3.12] gh-109413: libregrtest: Backport `.py`-file changes from GH-112558 (GH-112605) (#112607) [3.12] gh-109413: libregrtest: Backport `.py`-file changes from GH-112558 (GH-112605) (cherry picked from commit acc62db8af4e472a4922c10b91c60c36c49a5f79) Co-authored-by: Alex Waygood --- Lib/test/libregrtest/cmdline.py | 4 ++-- Lib/test/libregrtest/refleak.py | 3 ++- Lib/test/libregrtest/results.py | 2 +- Lib/test/libregrtest/run_workers.py | 9 ++++++--- Lib/test/libregrtest/runtests.py | 3 ++- Lib/test/libregrtest/setup.py | 3 ++- Lib/test/libregrtest/utils.py | 6 +++--- 7 files changed, 18 insertions(+), 12 deletions(-) diff --git a/Lib/test/libregrtest/cmdline.py b/Lib/test/libregrtest/cmdline.py index 9ba6ec63cad9f8..23ca3565662146 100644 --- a/Lib/test/libregrtest/cmdline.py +++ b/Lib/test/libregrtest/cmdline.py @@ -3,7 +3,7 @@ import shlex import sys from test.support import os_helper -from .utils import ALL_RESOURCES, RESOURCE_NAMES +from .utils import ALL_RESOURCES, RESOURCE_NAMES, TestFilter USAGE = """\ @@ -161,7 +161,7 @@ def __init__(self, **kwargs) -> None: self.forever = False self.header = False self.failfast = False - self.match_tests = [] + self.match_tests: TestFilter = [] self.pgo = False self.pgo_extended = False self.worker_json = None diff --git a/Lib/test/libregrtest/refleak.py b/Lib/test/libregrtest/refleak.py index 59f48bd1e3cf78..8d18ff3ae96db5 100644 --- a/Lib/test/libregrtest/refleak.py +++ b/Lib/test/libregrtest/refleak.py @@ -52,7 +52,8 @@ def runtest_refleak(test_name, test_func, except ImportError: zdc = None # Run unmodified on platforms without zipimport support else: - zdc = zipimport._zip_directory_cache.copy() + # private attribute that mypy doesn't know about: + zdc = zipimport._zip_directory_cache.copy() # type: ignore[attr-defined] abcs = {} for abc in [getattr(collections.abc, a) for a in collections.abc.__all__]: if not isabstract(abc): diff --git a/Lib/test/libregrtest/results.py b/Lib/test/libregrtest/results.py index ad1744899cadc0..477de777f2a9ad 100644 --- a/Lib/test/libregrtest/results.py +++ b/Lib/test/libregrtest/results.py @@ -33,7 +33,7 @@ def __init__(self): self.test_times: list[tuple[float, TestName]] = [] self.stats = TestStats() # used by --junit-xml - self.testsuite_xml: list[str] = [] + self.testsuite_xml: list = [] def is_all_good(self): return (not self.bad diff --git a/Lib/test/libregrtest/run_workers.py b/Lib/test/libregrtest/run_workers.py index 25ca8623c687c5..7d2c1cd84cd1ee 100644 --- a/Lib/test/libregrtest/run_workers.py +++ b/Lib/test/libregrtest/run_workers.py @@ 
-10,7 +10,7 @@ import threading import time import traceback -from typing import Literal, TextIO +from typing import Any, Literal, TextIO from test import support from test.support import os_helper, MS_WINDOWS @@ -243,7 +243,9 @@ def create_json_file(self, stack: contextlib.ExitStack) -> tuple[JsonFile, TextI json_fd = json_tmpfile.fileno() if MS_WINDOWS: - json_handle = msvcrt.get_osfhandle(json_fd) + # The msvcrt module is only available on Windows; + # we run mypy with `--platform=linux` in CI + json_handle: int = msvcrt.get_osfhandle(json_fd) # type: ignore[attr-defined] json_file = JsonFile(json_handle, JsonFileType.WINDOWS_HANDLE) else: @@ -257,7 +259,7 @@ def create_worker_runtests(self, test_name: TestName, json_file: JsonFile) -> Wo else: match_tests = None - kwargs = {} + kwargs: dict[str, Any] = {} if match_tests: kwargs['match_tests'] = [(test, True) for test in match_tests] if self.runtests.output_on_failure: @@ -343,6 +345,7 @@ def _runtest(self, test_name: TestName) -> MultiprocessResult: json_file, json_tmpfile = self.create_json_file(stack) worker_runtests = self.create_worker_runtests(test_name, json_file) + retcode: str | int | None retcode, tmp_files = self.run_tmp_files(worker_runtests, stdout_file.fileno()) diff --git a/Lib/test/libregrtest/runtests.py b/Lib/test/libregrtest/runtests.py index 283c104c1afb37..4bf2b1fb576e04 100644 --- a/Lib/test/libregrtest/runtests.py +++ b/Lib/test/libregrtest/runtests.py @@ -33,7 +33,8 @@ def configure_subprocess(self, popen_kwargs: dict) -> None: popen_kwargs['pass_fds'] = [self.file] case JsonFileType.WINDOWS_HANDLE: # Windows handle - startupinfo = subprocess.STARTUPINFO() + # We run mypy with `--platform=linux` so it complains about this: + startupinfo = subprocess.STARTUPINFO() # type: ignore[attr-defined] startupinfo.lpAttributeList = {"handle_list": [self.file]} popen_kwargs['startupinfo'] = startupinfo diff --git a/Lib/test/libregrtest/setup.py b/Lib/test/libregrtest/setup.py index 97edba9f87d7f9..9e9741493e9a5b 100644 --- a/Lib/test/libregrtest/setup.py +++ b/Lib/test/libregrtest/setup.py @@ -124,7 +124,8 @@ def setup_tests(runtests: RunTests): support.LONG_TIMEOUT = min(support.LONG_TIMEOUT, timeout) if runtests.hunt_refleak: - unittest.BaseTestSuite._cleanup = False + # private attribute that mypy doesn't know about: + unittest.BaseTestSuite._cleanup = False # type: ignore[attr-defined] if runtests.gc_threshold is not None: gc.set_threshold(runtests.gc_threshold) diff --git a/Lib/test/libregrtest/utils.py b/Lib/test/libregrtest/utils.py index 5eb2d72186b2e6..5a0271bcc17d80 100644 --- a/Lib/test/libregrtest/utils.py +++ b/Lib/test/libregrtest/utils.py @@ -12,7 +12,7 @@ import sysconfig import tempfile import textwrap -from collections.abc import Callable +from collections.abc import Callable, Iterable from test import support from test.support import os_helper @@ -551,7 +551,7 @@ def is_cross_compiled(): return ('_PYTHON_HOST_PLATFORM' in os.environ) -def format_resources(use_resources: tuple[str, ...]): +def format_resources(use_resources: Iterable[str]): use_resources = set(use_resources) all_resources = set(ALL_RESOURCES) @@ -591,7 +591,7 @@ def display_header(use_resources: tuple[str, ...], print("== Python build:", ' '.join(get_build_info())) print("== cwd:", os.getcwd()) - cpu_count = os.cpu_count() + cpu_count: object = os.cpu_count() if cpu_count: affinity = process_cpu_count() if affinity and affinity != cpu_count: From 8d8efe7e7148303b4ecffbe54109356c8ef4ea77 Mon Sep 17 00:00:00 2001 From: Alex Waygood Date: Sat, 
2 Dec 2023 22:50:16 +0000 Subject: [PATCH 247/265] [3.11] gh-112316: Improve docs of `inspect.signature` and `Signature.from_callable` (#112317) (#112630) (cherry-picked from commit a74daba7ca) Co-authored-by: Nikita Sobolev --- Doc/library/inspect.rst | 37 ++++++++++++++++++------------------- 1 file changed, 18 insertions(+), 19 deletions(-) diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index e95196376bb24b..1eda20e3609388 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -599,7 +599,7 @@ function. .. function:: signature(callable, *, follow_wrapped=True, globals=None, locals=None, eval_str=False) - Return a :class:`Signature` object for the given ``callable``:: + Return a :class:`Signature` object for the given *callable*:: >>> from inspect import signature >>> def foo(a, *, b:int, **kwargs): @@ -625,29 +625,30 @@ function. For objects defined in modules using stringized annotations (``from __future__ import annotations``), :func:`signature` will attempt to automatically un-stringize the annotations using - :func:`inspect.get_annotations()`. The - ``global``, ``locals``, and ``eval_str`` parameters are passed - into :func:`inspect.get_annotations()` when resolving the - annotations; see the documentation for :func:`inspect.get_annotations()` + :func:`get_annotations`. The + *global*, *locals*, and *eval_str* parameters are passed + into :func:`get_annotations` when resolving the + annotations; see the documentation for :func:`get_annotations` for instructions on how to use these parameters. Raises :exc:`ValueError` if no signature can be provided, and :exc:`TypeError` if that type of object is not supported. Also, - if the annotations are stringized, and ``eval_str`` is not false, - the ``eval()`` call(s) to un-stringize the annotations could - potentially raise any kind of exception. + if the annotations are stringized, and *eval_str* is not false, + the ``eval()`` call(s) to un-stringize the annotations in :func:`get_annotations` + could potentially raise any kind of exception. A slash(/) in the signature of a function denotes that the parameters prior to it are positional-only. For more info, see :ref:`the FAQ entry on positional-only parameters `. - .. versionadded:: 3.5 - ``follow_wrapped`` parameter. Pass ``False`` to get a signature of - ``callable`` specifically (``callable.__wrapped__`` will not be used to + .. versionchanged:: 3.5 + The *follow_wrapped* parameter was added. + Pass ``False`` to get a signature of + *callable* specifically (``callable.__wrapped__`` will not be used to unwrap decorated callables.) - .. versionadded:: 3.10 - ``globals``, ``locals``, and ``eval_str`` parameters. + .. versionchanged:: 3.10 + The *globals*, *locals*, and *eval_str* parameters were added. .. note:: @@ -727,12 +728,10 @@ function. >>> str(new_sig) "(a, b) -> 'new return anno'" - .. classmethod:: Signature.from_callable(obj, *, follow_wrapped=True, globalns=None, localns=None) + .. classmethod:: Signature.from_callable(obj, *, follow_wrapped=True, globals=None, locals=None, eval_str=False) Return a :class:`Signature` (or its subclass) object for a given callable - ``obj``. Pass ``follow_wrapped=False`` to get a signature of ``obj`` - without unwrapping its ``__wrapped__`` chain. ``globalns`` and - ``localns`` will be used as the namespaces when resolving annotations. + *obj*. This method simplifies subclassing of :class:`Signature`:: @@ -745,8 +744,8 @@ function. .. versionadded:: 3.5 - .. 
versionadded:: 3.10 - ``globalns`` and ``localns`` parameters. + .. versionchanged:: 3.10 + The *globals*, *locals*, and *eval_str* parameters were added. .. class:: Parameter(name, kind, *, default=Parameter.empty, annotation=Parameter.empty) From db537702e4ff6801699204a72ec635e39d885735 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 3 Dec 2023 01:58:13 +0100 Subject: [PATCH 248/265] [3.11] [3.12] gh-112618: Make Annotated cache typed (GH-112619) (GH-112628) (#112633) [3.12] gh-112618: Make Annotated cache typed (GH-112619) (GH-112628) (cherry picked from commit 2a378ca2efcb91db4b04fb4e052424fd610ffbc9) Co-authored-by: Alex Waygood --- Lib/test/test_typing.py | 34 +++++++++++++++++++ Lib/typing.py | 9 +++-- ...-12-02-12-55-17.gh-issue-112618.7_FT8-.rst | 2 ++ 3 files changed, 43 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst diff --git a/Lib/test/test_typing.py b/Lib/test/test_typing.py index f663fbd8679e0e..b0e3ca96fd1c7f 100644 --- a/Lib/test/test_typing.py +++ b/Lib/test/test_typing.py @@ -7627,6 +7627,40 @@ class X(Annotated[int, (1, 10)]): ... self.assertEqual(X.__mro__, (X, int, object), "Annotated should be transparent.") + def test_annotated_cached_with_types(self): + class A(str): ... + class B(str): ... + + field_a1 = Annotated[str, A("X")] + field_a2 = Annotated[str, B("X")] + a1_metadata = field_a1.__metadata__[0] + a2_metadata = field_a2.__metadata__[0] + + self.assertIs(type(a1_metadata), A) + self.assertEqual(a1_metadata, A("X")) + self.assertIs(type(a2_metadata), B) + self.assertEqual(a2_metadata, B("X")) + self.assertIsNot(type(a1_metadata), type(a2_metadata)) + + field_b1 = Annotated[str, A("Y")] + field_b2 = Annotated[str, B("Y")] + b1_metadata = field_b1.__metadata__[0] + b2_metadata = field_b2.__metadata__[0] + + self.assertIs(type(b1_metadata), A) + self.assertEqual(b1_metadata, A("Y")) + self.assertIs(type(b2_metadata), B) + self.assertEqual(b2_metadata, B("Y")) + self.assertIsNot(type(b1_metadata), type(b2_metadata)) + + field_c1 = Annotated[int, 1] + field_c2 = Annotated[int, 1.0] + field_c3 = Annotated[int, True] + + self.assertIs(type(field_c1.__metadata__[0]), int) + self.assertIs(type(field_c2.__metadata__[0]), float) + self.assertIs(type(field_c3.__metadata__[0]), bool) + class TypeAliasTests(BaseTestCase): def test_canonical_usage_with_variable_annotation(self): diff --git a/Lib/typing.py b/Lib/typing.py index 95f855b5a88452..85166dfd946314 100644 --- a/Lib/typing.py +++ b/Lib/typing.py @@ -2214,9 +2214,14 @@ class Annotated: def __new__(cls, *args, **kwargs): raise TypeError("Type Annotated cannot be instantiated.") - @_tp_cache def __class_getitem__(cls, params): - if not isinstance(params, tuple) or len(params) < 2: + if not isinstance(params, tuple): + params = (params,) + return cls._class_getitem_inner(cls, *params) + + @_tp_cache(typed=True) + def _class_getitem_inner(cls, *params): + if len(params) < 2: raise TypeError("Annotated[...] should be used " "with at least two arguments (a type and an " "annotation).") diff --git a/Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst b/Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst new file mode 100644 index 00000000000000..c732de15609c96 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst @@ -0,0 +1,2 @@ +Fix a caching bug relating to :data:`typing.Annotated`. 
+``Annotated[str, True]`` is no longer identical to ``Annotated[str, 1]``. From 6ab6c8579ced22579a6fc76d45233e0e2b8724a5 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 3 Dec 2023 12:35:21 +0100 Subject: [PATCH 249/265] [3.11] [3.12] gh-112316: improve docs for `inspect.signature` and `inspect.Signature` (GH-112631) (GH-112649) (#112652) [3.12] gh-112316: improve docs for `inspect.signature` and `inspect.Signature` (GH-112631) (GH-112649) (cherry-picked from commit fc9e24b01fb7da4160b82cef26981d72bb678c13) (cherry picked from commit 6221482f0c459c2e7dbd946265069246a400072a) Co-authored-by: Alex Waygood --- Doc/library/inspect.rst | 84 +++++++++++++++++++++++++---------------- 1 file changed, 51 insertions(+), 33 deletions(-) diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index 1eda20e3609388..c44459cb087868 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -1,6 +1,11 @@ :mod:`inspect` --- Inspect live objects ======================================= +.. testsetup:: * + + import inspect + from inspect import * + .. module:: inspect :synopsis: Extract information and source code from live objects. @@ -593,13 +598,16 @@ Introspecting callables with the Signature object .. versionadded:: 3.3 -The Signature object represents the call signature of a callable object and its -return annotation. To retrieve a Signature object, use the :func:`signature` +The :class:`Signature` object represents the call signature of a callable object +and its return annotation. To retrieve a :class:`!Signature` object, +use the :func:`!signature` function. .. function:: signature(callable, *, follow_wrapped=True, globals=None, locals=None, eval_str=False) - Return a :class:`Signature` object for the given *callable*:: + Return a :class:`Signature` object for the given *callable*: + + .. doctest:: >>> from inspect import signature >>> def foo(a, *, b:int, **kwargs): @@ -608,10 +616,10 @@ function. >>> sig = signature(foo) >>> str(sig) - '(a, *, b:int, **kwargs)' + '(a, *, b: int, **kwargs)' >>> str(sig.parameters['b']) - 'b:int' + 'b: int' >>> sig.parameters['b'].annotation @@ -626,7 +634,7 @@ function. (``from __future__ import annotations``), :func:`signature` will attempt to automatically un-stringize the annotations using :func:`get_annotations`. The - *global*, *locals*, and *eval_str* parameters are passed + *globals*, *locals*, and *eval_str* parameters are passed into :func:`get_annotations` when resolving the annotations; see the documentation for :func:`get_annotations` for instructions on how to use these parameters. @@ -659,7 +667,8 @@ function. .. class:: Signature(parameters=None, *, return_annotation=Signature.empty) - A Signature object represents the call signature of a function and its return + A :class:`!Signature` object represents the call signature of a function + and its return annotation. For each parameter accepted by the function it stores a :class:`Parameter` object in its :attr:`parameters` collection. @@ -669,14 +678,14 @@ function. positional-only first, then positional-or-keyword, and that parameters with defaults follow parameters without defaults. - The optional *return_annotation* argument, can be an arbitrary Python object, - is the "return" annotation of the callable. + The optional *return_annotation* argument can be an arbitrary Python object. + It represents the "return" annotation of the callable. - Signature objects are *immutable*. 
Use :meth:`Signature.replace` to make a + :class:`!Signature` objects are *immutable*. Use :meth:`Signature.replace` to make a modified copy. .. versionchanged:: 3.5 - Signature objects are picklable and :term:`hashable`. + :class:`!Signature` objects are now picklable and :term:`hashable`. .. attribute:: Signature.empty @@ -713,13 +722,15 @@ function. .. method:: Signature.replace(*[, parameters][, return_annotation]) - Create a new Signature instance based on the instance :meth:`replace` was invoked - on. It is possible to pass different ``parameters`` and/or - ``return_annotation`` to override the corresponding properties of the base - signature. To remove return_annotation from the copied Signature, pass in + Create a new :class:`Signature` instance based on the instance + :meth:`replace` was invoked on. + It is possible to pass different *parameters* and/or + *return_annotation* to override the corresponding properties of the base + signature. To remove ``return_annotation`` from the copied + :class:`!Signature`, pass in :attr:`Signature.empty`. - :: + .. doctest:: >>> def test(a, b): ... pass @@ -733,12 +744,14 @@ function. Return a :class:`Signature` (or its subclass) object for a given callable *obj*. - This method simplifies subclassing of :class:`Signature`:: + This method simplifies subclassing of :class:`Signature`: - class MySignature(Signature): - pass - sig = MySignature.from_callable(min) - assert isinstance(sig, MySignature) + .. testcode:: + + class MySignature(Signature): + pass + sig = MySignature.from_callable(sum) + assert isinstance(sig, MySignature) Its behavior is otherwise identical to that of :func:`signature`. @@ -750,11 +763,12 @@ function. .. class:: Parameter(name, kind, *, default=Parameter.empty, annotation=Parameter.empty) - Parameter objects are *immutable*. Instead of modifying a Parameter object, + :class:`!Parameter` objects are *immutable*. + Instead of modifying a :class:`!Parameter` object, you can use :meth:`Parameter.replace` to create a modified copy. .. versionchanged:: 3.5 - Parameter objects are picklable and :term:`hashable`. + Parameter objects are now picklable and :term:`hashable`. .. attribute:: Parameter.empty @@ -773,7 +787,7 @@ function. expressions. .. versionchanged:: 3.6 - These parameter names are exposed by this module as names like + These parameter names are now exposed by this module as names like ``implicit0``. .. attribute:: Parameter.default @@ -823,7 +837,9 @@ function. | | definition. | +------------------------+----------------------------------------------+ - Example: print all keyword-only arguments without default values:: + Example: print all keyword-only arguments without default values: + + .. doctest:: >>> def foo(a, b, *, c, d=10): ... pass @@ -837,11 +853,13 @@ function. .. attribute:: Parameter.kind.description - Describes a enum value of Parameter.kind. + Describes a enum value of :attr:`Parameter.kind`. .. versionadded:: 3.8 - Example: print all descriptions of arguments:: + Example: print all descriptions of arguments: + + .. doctest:: >>> def foo(a, b, *, c, d=10): ... pass @@ -856,12 +874,12 @@ function. .. method:: Parameter.replace(*[, name][, kind][, default][, annotation]) - Create a new Parameter instance based on the instance replaced was invoked - on. To override a :class:`Parameter` attribute, pass the corresponding + Create a new :class:`Parameter` instance based on the instance replaced was invoked + on. To override a :class:`!Parameter` attribute, pass the corresponding argument. 
To remove a default value or/and an annotation from a - Parameter, pass :attr:`Parameter.empty`. + :class:`!Parameter`, pass :attr:`Parameter.empty`. - :: + .. doctest:: >>> from inspect import Parameter >>> param = Parameter('foo', Parameter.KEYWORD_ONLY, default=42) @@ -872,10 +890,10 @@ function. 'foo=42' >>> str(param.replace(default=Parameter.empty, annotation='spam')) - "foo:'spam'" + "foo: 'spam'" .. versionchanged:: 3.4 - In Python 3.3 Parameter objects were allowed to have ``name`` set + In Python 3.3 :class:`Parameter` objects were allowed to have ``name`` set to ``None`` if their ``kind`` was set to ``POSITIONAL_ONLY``. This is no longer permitted. From 999ff4f94a0adf0d9e292216cd82c19ad1f6c0ca Mon Sep 17 00:00:00 2001 From: Alex Waygood Date: Sun, 3 Dec 2023 12:01:33 +0000 Subject: [PATCH 250/265] [3.11] Run more `inspect.rst` code snippets in CI (#112654) (#112656) (cherry-picked from commit 4ed46d224401243399b41c7ceef4532bd249da27) --- Doc/library/inspect.rst | 65 ++++++++++++++++++++++++----------------- 1 file changed, 38 insertions(+), 27 deletions(-) diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index c44459cb087868..a9cf64b2c564c3 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -371,7 +371,11 @@ attributes (see :ref:`import-mod-attrs` for module attributes): Return ``True`` if the object can be used in :keyword:`await` expression. Can also be used to distinguish generator-based coroutines from regular - generators:: + generators: + + .. testcode:: + + import types def gen(): yield @@ -388,13 +392,15 @@ attributes (see :ref:`import-mod-attrs` for module attributes): .. function:: isasyncgenfunction(object) Return ``True`` if the object is an :term:`asynchronous generator` function, - for example:: + for example: - >>> async def agen(): - ... yield 1 - ... - >>> inspect.isasyncgenfunction(agen) - True + .. doctest:: + + >>> async def agen(): + ... yield 1 + ... + >>> inspect.isasyncgenfunction(agen) + True .. versionadded:: 3.6 @@ -946,18 +952,20 @@ function. For variable-keyword arguments (``**kwargs``) the default is an empty dict. - :: + .. doctest:: - >>> def foo(a, b='ham', *args): pass - >>> ba = inspect.signature(foo).bind('spam') - >>> ba.apply_defaults() - >>> ba.arguments - {'a': 'spam', 'b': 'ham', 'args': ()} + >>> def foo(a, b='ham', *args): pass + >>> ba = inspect.signature(foo).bind('spam') + >>> ba.apply_defaults() + >>> ba.arguments + {'a': 'spam', 'b': 'ham', 'args': ()} .. versionadded:: 3.5 The :attr:`args` and :attr:`kwargs` properties can be used to invoke - functions:: + functions: + + .. testcode:: def test(a, *, b): ... @@ -1076,19 +1084,22 @@ Classes and functions ``**`` arguments, if any) to their values from *args* and *kwds*. In case of invoking *func* incorrectly, i.e. whenever ``func(*args, **kwds)`` would raise an exception because of incompatible signature, an exception of the same type - and the same or similar message is raised. For example:: - - >>> from inspect import getcallargs - >>> def f(a, b=1, *pos, **named): - ... pass - >>> getcallargs(f, 1, 2, 3) == {'a': 1, 'named': {}, 'b': 2, 'pos': (3,)} - True - >>> getcallargs(f, a=2, x=4) == {'a': 2, 'named': {'x': 4}, 'b': 1, 'pos': ()} - True - >>> getcallargs(f) - Traceback (most recent call last): - ... - TypeError: f() missing 1 required positional argument: 'a' + and the same or similar message is raised. For example: + + .. doctest:: + + >>> from inspect import getcallargs + >>> def f(a, b=1, *pos, **named): + ... pass + ... 
+ >>> getcallargs(f, 1, 2, 3) == {'a': 1, 'named': {}, 'b': 2, 'pos': (3,)} + True + >>> getcallargs(f, a=2, x=4) == {'a': 2, 'named': {'x': 4}, 'b': 1, 'pos': ()} + True + >>> getcallargs(f) + Traceback (most recent call last): + ... + TypeError: f() missing 1 required positional argument: 'a' .. versionadded:: 3.2 From 3d87a988a6a36acfeef34c134070d53dc5e6dc4d Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 3 Dec 2023 16:20:51 +0100 Subject: [PATCH 251/265] [3.11] Fix link to 'The Perils of Floating Point', on the tutorial (GH-112499) (GH-112663) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Use author link to 'The Perils of Floating Point'. (cherry picked from commit c27b09c81368bc3b756e94a79a39307ce44a4a2c) Co-authored-by: Marco Aurélio A. Barbosa --- Doc/tutorial/floatingpoint.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/Doc/tutorial/floatingpoint.rst b/Doc/tutorial/floatingpoint.rst index 40c38be2fcd1a3..cffd64eb6a3fd3 100644 --- a/Doc/tutorial/floatingpoint.rst +++ b/Doc/tutorial/floatingpoint.rst @@ -131,7 +131,7 @@ section. See `Examples of Floating Point Problems `_ for a pleasant summary of how binary floating-point works and the kinds of problems commonly encountered in practice. Also see -`The Perils of Floating Point `_ +`The Perils of Floating Point `_ for a more complete account of other common surprises. As that says near the end, "there are no easy answers." Still, don't be unduly From 36ad8e61a60c9f512378102b016796e2dfebef39 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 3 Dec 2023 17:34:37 +0100 Subject: [PATCH 252/265] [3.11] gh-66819: More IDLE htest updates(2) (GH-112642) (#112644) Examine and update spec -- callable pairs. Revise run method. (cherry picked from commit 3855b45874d5f8eb92a4957fb9de6fdce63eb760) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/browser.py | 1 + Lib/idlelib/debugobj.py | 4 +- Lib/idlelib/idle_test/htest.py | 108 ++++++++++++++++----------------- Lib/idlelib/iomenu.py | 9 +-- Lib/idlelib/multicall.py | 2 + Lib/idlelib/pathbrowser.py | 6 +- Lib/idlelib/query.py | 2 +- Lib/idlelib/sidebar.py | 6 +- Lib/idlelib/statusbar.py | 1 + 9 files changed, 70 insertions(+), 69 deletions(-) diff --git a/Lib/idlelib/browser.py b/Lib/idlelib/browser.py index 4fe64dced60aca..672e229ffbca94 100644 --- a/Lib/idlelib/browser.py +++ b/Lib/idlelib/browser.py @@ -254,5 +254,6 @@ class Nested_in_closure: pass if len(sys.argv) == 1: # If pass file on command line, unittest fails. 
from unittest import main main('idlelib.idle_test.test_browser', verbosity=2, exit=False) + from idlelib.idle_test.htest import run run(_module_browser) diff --git a/Lib/idlelib/debugobj.py b/Lib/idlelib/debugobj.py index 032b686f379378..0bf2cb1d5bbfe2 100644 --- a/Lib/idlelib/debugobj.py +++ b/Lib/idlelib/debugobj.py @@ -120,7 +120,7 @@ def make_objecttreeitem(labeltext, object, setfunction=None): return c(labeltext, object, setfunction) -def _object_browser(parent): # htest # +def _debug_object_browser(parent): # htest # import sys from tkinter import Toplevel top = Toplevel(parent) @@ -140,4 +140,4 @@ def _object_browser(parent): # htest # main('idlelib.idle_test.test_debugobj', verbosity=2, exit=False) from idlelib.idle_test.htest import run - run(_object_browser) + run(_debug_object_browser) diff --git a/Lib/idlelib/idle_test/htest.py b/Lib/idlelib/idle_test/htest.py index e21ab98d8aab89..a59b474fba47d8 100644 --- a/Lib/idlelib/idle_test/htest.py +++ b/Lib/idlelib/idle_test/htest.py @@ -8,14 +8,15 @@ In a tested module, let X be a global name bound to a callable (class or function) whose .__name__ attribute is also X (the usual situation). The -first parameter of X must be 'parent'. When called, the parent argument -will be the root window. X must create a child Toplevel(parent) window -(or subclass thereof). The Toplevel may be a test widget or dialog, in -which case the callable is the corresponding class. Or the Toplevel may -contain the widget to be tested or set up a context in which a test -widget is invoked. In this latter case, the callable is a wrapper -function that sets up the Toplevel and other objects. Wrapper function -names, such as _editor_window', should start with '_' and be lowercase. +first parameter of X must be 'parent' or 'master'. When called, the +first argument will be the root window. X must create a child +Toplevel(parent/master) (or subclass thereof). The Toplevel may be a +test widget or dialog, in which case the callable is the corresponding +class. Or the Toplevel may contain the widget to be tested or set up a +context in which a test widget is invoked. In this latter case, the +callable is a wrapper function that sets up the Toplevel and other +objects. Wrapper function names, such as _editor_window', should start +with '_' and be lowercase. End the module with @@ -117,12 +118,20 @@ 'file': 'query', 'kwds': {'title': 'Customize query.py Run', '_htest': True}, - 'msg': "Enter with or [Run]. Print valid entry to Shell\n" + 'msg': "Enter with or [OK]. Print valid entry to Shell\n" "Arguments are parsed into a list\n" "Mode is currently restart True or False\n" "Close dialog with valid entry, , [Cancel], [X]" } +_debug_object_browser_spec = { + 'file': 'debugobj', + 'kwds': {}, + 'msg': "Double click on items up to the lowest level.\n" + "Attributes of the objects and related information " + "will be displayed side-by-side at each level." + } + # TODO Improve message _dyn_option_menu_spec = { 'file': 'dynoption', @@ -178,7 +187,7 @@ "Any url ('www...', 'http...') is accepted.\n" "Test Browse with and without path, as cannot unittest.\n" "[Ok] or prints valid entry to shell\n" - "[Cancel] or prints None to shell" + ", [Cancel], or [X] prints None to shell" } _io_binding_spec = { @@ -199,17 +208,17 @@ 'kwds': {}, 'msg': textwrap.dedent("""\ 1. Click on the line numbers and drag down below the edge of the - window, moving the mouse a bit and then leaving it there for a while. 
- The text and line numbers should gradually scroll down, with the - selection updated continuously. + window, moving the mouse a bit and then leaving it there for a + while. The text and line numbers should gradually scroll down, + with the selection updated continuously. - 2. With the lines still selected, click on a line number above the - selected lines. Only the line whose number was clicked should be - selected. + 2. With the lines still selected, click on a line number above + or below the selected lines. Only the line whose number was + clicked should be selected. - 3. Repeat step #1, dragging to above the window. The text and line - numbers should gradually scroll up, with the selection updated - continuously. + 3. Repeat step #1, dragging to above the window. The text and + line numbers should gradually scroll up, with the selection + updated continuously. 4. Repeat step #2, clicking a line number below the selection."""), } @@ -217,42 +226,33 @@ _multi_call_spec = { 'file': 'multicall', 'kwds': {}, - 'msg': "The following actions should trigger a print to console or IDLE" - " Shell.\nEntering and leaving the text area, key entry, " - ",\n, , " - ", \n, and " - "focusing out of the window\nare sequences to be tested." + 'msg': "The following should trigger a print to console or IDLE Shell.\n" + "Entering and leaving the text area, key entry, ,\n" + ", , , \n" + ", and focusing elsewhere." } _module_browser_spec = { 'file': 'browser', 'kwds': {}, - 'msg': "Inspect names of module, class(with superclass if " - "applicable), methods and functions.\nToggle nested items.\n" - "Double clicking on items prints a traceback for an exception " - "that is ignored." + 'msg': textwrap.dedent(""" + "Inspect names of module, class(with superclass if applicable), + "methods and functions. Toggle nested items. Double clicking + "on items prints a traceback for an exception that is ignored.""") } _multistatus_bar_spec = { 'file': 'statusbar', 'kwds': {}, 'msg': "Ensure presence of multi-status bar below text area.\n" - "Click 'Update Status' to change the multi-status text" - } - -_object_browser_spec = { - 'file': 'debugobj', - 'kwds': {}, - 'msg': "Double click on items up to the lowest level.\n" - "Attributes of the objects and related information " - "will be displayed side-by-side at each level." + "Click 'Update Status' to change the status text" } -_path_browser_spec = { +PathBrowser_spec = { 'file': 'pathbrowser', - 'kwds': {}, + 'kwds': {'_htest': True}, 'msg': "Test for correct display of all paths in sys.path.\n" - "Toggle nested items up to the lowest level.\n" + "Toggle nested items out to the lowest level.\n" "Double clicking on an item prints a traceback\n" "for an exception that is ignored." } @@ -367,11 +367,12 @@ } def run(*tests): + "Run callables in tests." root = tk.Tk() root.title('IDLE htest') root.resizable(0, 0) - # a scrollable Label like constant width text widget. + # A scrollable Label-like constant width text widget. frameLabel = tk.Frame(root, padx=10) frameLabel.pack() text = tk.Text(frameLabel, wrap='word') @@ -381,45 +382,44 @@ def run(*tests): scrollbar.pack(side='right', fill='y', expand=False) text.pack(side='left', fill='both', expand=True) - test_list = [] # List of tuples of the form (spec, callable widget) + test_list = [] # Make list of (spec, callable) tuples. 
if tests: for test in tests: test_spec = globals()[test.__name__ + '_spec'] test_spec['name'] = test.__name__ test_list.append((test_spec, test)) else: - for k, d in globals().items(): - if k.endswith('_spec'): - test_name = k[:-5] - test_spec = d + for key, dic in globals().items(): + if key.endswith('_spec'): + test_name = key[:-5] + test_spec = dic test_spec['name'] = test_name mod = import_module('idlelib.' + test_spec['file']) test = getattr(mod, test_name) test_list.append((test_spec, test)) + test_list.reverse() # So can pop in proper order in next_test. test_name = tk.StringVar(root) callable_object = None test_kwds = None def next_test(): - nonlocal test_name, callable_object, test_kwds if len(test_list) == 1: next_button.pack_forget() test_spec, callable_object = test_list.pop() test_kwds = test_spec['kwds'] - test_kwds['parent'] = root test_name.set('Test ' + test_spec['name']) - text.configure(state='normal') # enable text editing - text.delete('1.0','end') - text.insert("1.0",test_spec['msg']) - text.configure(state='disabled') # preserve read-only property + text['state'] = 'normal' # Enable text replacement. + text.delete('1.0', 'end') + text.insert("1.0", test_spec['msg']) + text['state'] = 'disabled' # Restore read-only property. def run_test(_=None): - widget = callable_object(**test_kwds) + widget = callable_object(root, **test_kwds) try: - print(widget.result) + print(widget.result) # Only true for query classes(?). except AttributeError: pass diff --git a/Lib/idlelib/iomenu.py b/Lib/idlelib/iomenu.py index af8159c2b33f51..7629101635b8bb 100644 --- a/Lib/idlelib/iomenu.py +++ b/Lib/idlelib/iomenu.py @@ -396,10 +396,11 @@ def updaterecentfileslist(self,filename): def _io_binding(parent): # htest # from tkinter import Toplevel, Text - root = Toplevel(parent) - root.title("Test IOBinding") + top = Toplevel(parent) + top.title("Test IOBinding") x, y = map(int, parent.geometry().split('+')[1:]) - root.geometry("+%d+%d" % (x, y + 175)) + top.geometry("+%d+%d" % (x, y + 175)) + class MyEditWin: def __init__(self, text): self.text = text @@ -423,7 +424,7 @@ def saveas(self, event): def savecopy(self, event): self.text.event_generate("<>") - text = Text(root) + text = Text(top) text.pack() text.focus_set() editwin = MyEditWin(text) diff --git a/Lib/idlelib/multicall.py b/Lib/idlelib/multicall.py index 0200f445cc9340..2aa4a54125156f 100644 --- a/Lib/idlelib/multicall.py +++ b/Lib/idlelib/multicall.py @@ -421,6 +421,8 @@ def _multi_call(parent): # htest # top.geometry("+%d+%d" % (x, y + 175)) text = MultiCallCreator(tkinter.Text)(top) text.pack() + text.focus_set() + def bindseq(seq, n=[0]): def handler(event): print(seq) diff --git a/Lib/idlelib/pathbrowser.py b/Lib/idlelib/pathbrowser.py index 6de242d0000bed..48a77875ba5801 100644 --- a/Lib/idlelib/pathbrowser.py +++ b/Lib/idlelib/pathbrowser.py @@ -99,13 +99,9 @@ def listmodules(self, allnames): return sorted -def _path_browser(parent): # htest # - PathBrowser(parent, _htest=True) - parent.mainloop() - if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_pathbrowser', verbosity=2, exit=False) from idlelib.idle_test.htest import run - run(_path_browser) + run(PathBrowser) diff --git a/Lib/idlelib/query.py b/Lib/idlelib/query.py index df02f2123ab02f..57230e2aaca66d 100644 --- a/Lib/idlelib/query.py +++ b/Lib/idlelib/query.py @@ -368,7 +368,7 @@ def create_extra(self): sticky='we') def cli_args_ok(self): - "Validity check and parsing for command line arguments." 
+ "Return command line arg list or None if error." cli_string = self.entry.get().strip() try: cli_args = shlex.split(cli_string, posix=True) diff --git a/Lib/idlelib/sidebar.py b/Lib/idlelib/sidebar.py index fb1084dbf3f18b..5dc17fce9cd720 100644 --- a/Lib/idlelib/sidebar.py +++ b/Lib/idlelib/sidebar.py @@ -517,13 +517,13 @@ def update_colors(self): def _linenumbers_drag_scrolling(parent): # htest # from idlelib.idle_test.test_sidebar import Dummy_editwin - toplevel = tk.Toplevel(parent) - text_frame = tk.Frame(toplevel) + top = tk.Toplevel(parent) + text_frame = tk.Frame(top) text_frame.pack(side=tk.LEFT, fill=tk.BOTH, expand=True) text_frame.rowconfigure(1, weight=1) text_frame.columnconfigure(1, weight=1) - font = idleConf.GetFont(toplevel, 'main', 'EditorWindow') + font = idleConf.GetFont(top, 'main', 'EditorWindow') text = tk.Text(text_frame, width=80, height=24, wrap=tk.NONE, font=font) text.grid(row=1, column=1, sticky=tk.NSEW) diff --git a/Lib/idlelib/statusbar.py b/Lib/idlelib/statusbar.py index 755fafb0ac6438..7048bd64b98753 100644 --- a/Lib/idlelib/statusbar.py +++ b/Lib/idlelib/statusbar.py @@ -26,6 +26,7 @@ def _multistatus_bar(parent): # htest # x, y = map(int, parent.geometry().split('+')[1:]) top.geometry("+%d+%d" %(x, y + 175)) top.title("Test multistatus bar") + frame = Frame(top) text = Text(frame, height=5, width=40) text.pack() From 05ea7e5d4dd4b4f75d2962b24e41c55d656e6169 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 3 Dec 2023 18:39:10 +0100 Subject: [PATCH 253/265] [3.11] gh-101100: Fix most Sphinx nitpicks in `inspect.rst` (GH-112662) (#112667) gh-101100: Fix most Sphinx nitpicks in `inspect.rst` (GH-112662) (cherry picked from commit 45650d1c479a8b0370f126d821718dd3c502f333) Co-authored-by: Alex Waygood --- Doc/conf.py | 3 +++ Doc/library/inspect.rst | 15 +++++++++------ Doc/reference/datamodel.rst | 2 ++ Doc/whatsnew/2.6.rst | 5 +++-- Doc/whatsnew/2.7.rst | 3 ++- 5 files changed, 19 insertions(+), 9 deletions(-) diff --git a/Doc/conf.py b/Doc/conf.py index 3f5ba5d969d313..042efcea285883 100644 --- a/Doc/conf.py +++ b/Doc/conf.py @@ -149,6 +149,9 @@ ('envvar', 'USER'), ('envvar', 'USERNAME'), ('envvar', 'USERPROFILE'), + # Deprecated function that was never documented: + ('py:func', 'getargspec'), + ('py:func', 'inspect.getargspec'), ] # Temporary undocumented names. diff --git a/Doc/library/inspect.rst b/Doc/library/inspect.rst index a9cf64b2c564c3..667ee441e862b4 100644 --- a/Doc/library/inspect.rst +++ b/Doc/library/inspect.rst @@ -273,7 +273,7 @@ attributes (see :ref:`import-mod-attrs` for module attributes): :func:`getmembers` will only return class attributes defined in the metaclass when the argument is a class and those attributes have been - listed in the metaclass' custom :meth:`__dir__`. + listed in the metaclass' custom :meth:`~object.__dir__`. .. function:: getmembers_static(object[, predicate]) @@ -466,12 +466,13 @@ attributes (see :ref:`import-mod-attrs` for module attributes): has a :meth:`~object.__get__` method but not a :meth:`~object.__set__` method, but beyond that the set of attributes varies. A :attr:`~definition.__name__` attribute is usually - sensible, and :attr:`__doc__` often is. + sensible, and :attr:`!__doc__` often is. 
Methods implemented via descriptors that also pass one of the other tests return ``False`` from the :func:`ismethoddescriptor` test, simply because the other tests promise more -- you can, e.g., count on having the - :attr:`__func__` attribute (etc) when an object passes :func:`ismethod`. + :ref:`__func__ ` attribute (etc) when an object passes + :func:`ismethod`. .. function:: isdatadescriptor(object) @@ -482,7 +483,7 @@ attributes (see :ref:`import-mod-attrs` for module attributes): Examples are properties (defined in Python), getsets, and members. The latter two are defined in C and there are more specific tests available for those types, which is robust across Python implementations. Typically, data - descriptors will also have :attr:`~definition.__name__` and :attr:`__doc__` attributes + descriptors will also have :attr:`~definition.__name__` and :attr:`!__doc__` attributes (properties, getsets, and members have both of these attributes), but this is not guaranteed. @@ -1401,7 +1402,8 @@ Fetching attributes statically Both :func:`getattr` and :func:`hasattr` can trigger code execution when fetching or checking for the existence of attributes. Descriptors, like -properties, will be invoked and :meth:`__getattr__` and :meth:`__getattribute__` +properties, will be invoked and :meth:`~object.__getattr__` and +:meth:`~object.__getattribute__` may be called. For cases where you want passive introspection, like documentation tools, this @@ -1411,7 +1413,8 @@ but avoids executing code when it fetches attributes. .. function:: getattr_static(obj, attr, default=None) Retrieve attributes without triggering dynamic lookup via the - descriptor protocol, :meth:`__getattr__` or :meth:`__getattribute__`. + descriptor protocol, :meth:`~object.__getattr__` + or :meth:`~object.__getattribute__`. Note: this function may not be able to retrieve all attributes that getattr can fetch (like dynamically created attributes) diff --git a/Doc/reference/datamodel.rst b/Doc/reference/datamodel.rst index ce92a120609046..f3efb2a1fa3aa7 100644 --- a/Doc/reference/datamodel.rst +++ b/Doc/reference/datamodel.rst @@ -627,6 +627,8 @@ code object; see the description of internal types below. The module. +.. _instance-methods: + Instance methods ^^^^^^^^^^^^^^^^ diff --git a/Doc/whatsnew/2.6.rst b/Doc/whatsnew/2.6.rst index 2b7ef2cd4fe4a9..7d8987e100f9ee 100644 --- a/Doc/whatsnew/2.6.rst +++ b/Doc/whatsnew/2.6.rst @@ -1677,8 +1677,9 @@ Some smaller changes made to the core Python language are: (:issue:`1591665`) * Instance method objects have new attributes for the object and function - comprising the method; the new synonym for :attr:`im_self` is - :attr:`__self__`, and :attr:`im_func` is also available as :attr:`__func__`. + comprising the method; the new synonym for :attr:`!im_self` is + :ref:`__self__ `, and :attr:`!im_func` is also available as + :ref:`__func__ `. The old names are still supported in Python 2.6, but are gone in 3.0. * An obscure change: when you use the :func:`locals` function inside a diff --git a/Doc/whatsnew/2.7.rst b/Doc/whatsnew/2.7.rst index b9d0ab08e6b724..4cd4aae3451b7d 100644 --- a/Doc/whatsnew/2.7.rst +++ b/Doc/whatsnew/2.7.rst @@ -860,7 +860,8 @@ Some smaller changes made to the core Python language are: * When using ``@classmethod`` and ``@staticmethod`` to wrap methods as class or static methods, the wrapper object now - exposes the wrapped function as their :attr:`__func__` attribute. + exposes the wrapped function as their :ref:`__func__ ` + attribute. 
(Contributed by Amaury Forgeot d'Arc, after a suggestion by George Sakkis; :issue:`5982`.) From 73411ea95eaf09b926bee4f682bf8e9a1e4825a7 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Sun, 3 Dec 2023 21:30:31 +0100 Subject: [PATCH 254/265] [3.11] gh-101100: Fix Sphinx warning in `library/gettext.rst` (GH-112668) (#112673) gh-101100: Fix Sphinx warning in `library/gettext.rst` (GH-112668) (cherry picked from commit 489aeac3a2d3b347ff033334688e2f44eec7944a) Co-authored-by: Hugo van Kemenade Co-authored-by: Alex Waygood --- Doc/library/gettext.rst | 14 +++++++------- Doc/tools/.nitignore | 1 - 2 files changed, 7 insertions(+), 8 deletions(-) diff --git a/Doc/library/gettext.rst b/Doc/library/gettext.rst index dc6cf5533fccbe..41beac3e0c7396 100644 --- a/Doc/library/gettext.rst +++ b/Doc/library/gettext.rst @@ -257,7 +257,7 @@ are the methods of :class:`!NullTranslations`: .. method:: info() - Return the "protected" :attr:`_info` variable, a dictionary containing + Return a dictionary containing the metadata found in the message catalog file. @@ -296,9 +296,9 @@ are the methods of :class:`!NullTranslations`: The :class:`GNUTranslations` class ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -The :mod:`gettext` module provides one additional class derived from +The :mod:`!gettext` module provides one additional class derived from :class:`NullTranslations`: :class:`GNUTranslations`. This class overrides -:meth:`_parse` to enable reading GNU :program:`gettext` format :file:`.mo` files +:meth:`!_parse` to enable reading GNU :program:`gettext` format :file:`.mo` files in both big-endian and little-endian format. :class:`GNUTranslations` parses optional metadata out of the translation @@ -306,7 +306,7 @@ catalog. It is convention with GNU :program:`gettext` to include metadata as the translation for the empty string. This metadata is in :rfc:`822`\ -style ``key: value`` pairs, and should contain the ``Project-Id-Version`` key. If the key ``Content-Type`` is found, then the ``charset`` property is used to -initialize the "protected" :attr:`_charset` instance variable, defaulting to +initialize the "protected" :attr:`!_charset` instance variable, defaulting to ``None`` if not found. If the charset encoding is specified, then all message ids and message strings read from the catalog are converted to Unicode using this encoding, else ASCII is assumed. @@ -315,7 +315,7 @@ Since message ids are read as Unicode strings too, all ``*gettext()`` methods will assume message ids as Unicode strings, not byte strings. The entire set of key/value pairs are placed into a dictionary and set as the -"protected" :attr:`_info` instance variable. +"protected" :attr:`!_info` instance variable. If the :file:`.mo` file's magic number is invalid, the major version number is unexpected, or if other problems occur while reading the file, instantiating a @@ -636,9 +636,9 @@ implementations, and valuable experience to the creation of this module: .. rubric:: Footnotes -.. [#] The default locale directory is system dependent; for example, on RedHat Linux +.. [#] The default locale directory is system dependent; for example, on Red Hat Linux it is :file:`/usr/share/locale`, but on Solaris it is :file:`/usr/lib/locale`. - The :mod:`gettext` module does not try to support these system dependent + The :mod:`!gettext` module does not try to support these system dependent defaults; instead its default is :file:`{sys.base_prefix}/share/locale` (see :data:`sys.base_prefix`). 
For this reason, it is always best to call :func:`bindtextdomain` with an explicit absolute path at the start of your diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 8ae1ab62e69540..9de2b233ea151f 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -62,7 +62,6 @@ Doc/library/fcntl.rst Doc/library/ftplib.rst Doc/library/functions.rst Doc/library/functools.rst -Doc/library/gettext.rst Doc/library/http.client.rst Doc/library/http.cookiejar.rst Doc/library/http.cookies.rst From e347a8e0e3c00ef726ae627686771aadd74fc967 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 06:10:22 +0100 Subject: [PATCH 255/265] [3.11] gh-66819: More IDLE htest updates(3) (GH-112683) (#112685) Revise spec-callable pairs from percolator to end. (cherry picked from commit 5a1b5316af648ae79bb91f28253b6272bbbd2886) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/help.py | 4 ++-- Lib/idlelib/idle_test/htest.py | 16 ++++++++-------- Lib/idlelib/percolator.py | 12 ++++++------ Lib/idlelib/scrolledlist.py | 3 ++- Lib/idlelib/stackviewer.py | 4 ++-- Lib/idlelib/undo.py | 14 +++++++------- 6 files changed, 27 insertions(+), 26 deletions(-) diff --git a/Lib/idlelib/help.py b/Lib/idlelib/help.py index cc027b9cef4f5b..580a327f620a79 100644 --- a/Lib/idlelib/help.py +++ b/Lib/idlelib/help.py @@ -278,7 +278,7 @@ def copy_strip(): out.write(line.rstrip() + b'\n') print(f'{src} copied to {dst}') -def show_idlehelp(parent): +def _helpwindow(parent): "Create HelpWindow; called from Idle Help event handler." filename = join(abspath(dirname(__file__)), 'help.html') if not isfile(filename): @@ -291,4 +291,4 @@ def show_idlehelp(parent): main('idlelib.idle_test.test_help', verbosity=2, exit=False) from idlelib.idle_test.htest import run - run(show_idlehelp) + run(_helpwindow) diff --git a/Lib/idlelib/idle_test/htest.py b/Lib/idlelib/idle_test/htest.py index a59b474fba47d8..4042106bf44a9f 100644 --- a/Lib/idlelib/idle_test/htest.py +++ b/Lib/idlelib/idle_test/htest.py @@ -190,6 +190,13 @@ ", [Cancel], or [X] prints None to shell" } +_helpwindow_spec = { + 'file': 'help', + 'kwds': {}, + 'msg': "If the help text displays, this works.\n" + "Text is selectable. Window is scrollable." + } + _io_binding_spec = { 'file': 'iomenu', 'kwds': {}, @@ -312,14 +319,7 @@ "Right clicking an item will display a popup." } -show_idlehelp_spec = { - 'file': 'help', - 'kwds': {}, - 'msg': "If the help text displays, this works.\n" - "Text is selectable. Window is scrollable." 
- } - -_stack_viewer_spec = { +_stackbrowser_spec = { 'file': 'stackviewer', 'kwds': {}, 'msg': "A stacktrace for a NameError exception.\n" diff --git a/Lib/idlelib/percolator.py b/Lib/idlelib/percolator.py index 1fe34d29f54eb2..91ad7272f4ae56 100644 --- a/Lib/idlelib/percolator.py +++ b/Lib/idlelib/percolator.py @@ -86,11 +86,11 @@ def delete(self, *args): print(self.name, ": delete", args) self.delegate.delete(*args) - box = tk.Toplevel(parent) - box.title("Test Percolator") + top = tk.Toplevel(parent) + top.title("Test Percolator") x, y = map(int, parent.geometry().split('+')[1:]) - box.geometry("+%d+%d" % (x, y + 175)) - text = tk.Text(box) + top.geometry("+%d+%d" % (x, y + 175)) + text = tk.Text(top) p = Percolator(text) pin = p.insertfilter pout = p.removefilter @@ -104,10 +104,10 @@ def toggle2(): text.pack() var1 = tk.IntVar(parent) - cb1 = tk.Checkbutton(box, text="Tracer1", command=toggle1, variable=var1) + cb1 = tk.Checkbutton(top, text="Tracer1", command=toggle1, variable=var1) cb1.pack() var2 = tk.IntVar(parent) - cb2 = tk.Checkbutton(box, text="Tracer2", command=toggle2, variable=var2) + cb2 = tk.Checkbutton(top, text="Tracer2", command=toggle2, variable=var2) cb2.pack() if __name__ == "__main__": diff --git a/Lib/idlelib/scrolledlist.py b/Lib/idlelib/scrolledlist.py index 71fd18ab19ec8a..4f1241a576fca1 100644 --- a/Lib/idlelib/scrolledlist.py +++ b/Lib/idlelib/scrolledlist.py @@ -132,6 +132,7 @@ def _scrolled_list(parent): # htest # top = Toplevel(parent) x, y = map(int, parent.geometry().split('+')[1:]) top.geometry("+%d+%d" % (x+200, y + 175)) + class MyScrolledList(ScrolledList): def fill_menu(self): self.menu.add_command(label="right click") def on_select(self, index): print("select", self.get(index)) @@ -143,7 +144,7 @@ def on_double(self, index): print("double", self.get(index)) if __name__ == '__main__': from unittest import main - main('idlelib.idle_test.test_scrolledlist', verbosity=2,) + main('idlelib.idle_test.test_scrolledlist', verbosity=2, exit=False) from idlelib.idle_test.htest import run run(_scrolled_list) diff --git a/Lib/idlelib/stackviewer.py b/Lib/idlelib/stackviewer.py index f8e60fd9b6d818..977c56ef15f2ae 100644 --- a/Lib/idlelib/stackviewer.py +++ b/Lib/idlelib/stackviewer.py @@ -113,7 +113,7 @@ def setfunction(value, key=key, object=self.object): return sublist -def _stack_viewer(parent): # htest # +def _stackbrowser(parent): # htest # from idlelib.pyshell import PyShellFileList top = tk.Toplevel(parent) top.title("Test StackViewer") @@ -131,4 +131,4 @@ def _stack_viewer(parent): # htest # main('idlelib.idle_test.test_stackviewer', verbosity=2, exit=False) from idlelib.idle_test.htest import run - run(_stack_viewer) + run(_stackbrowser) diff --git a/Lib/idlelib/undo.py b/Lib/idlelib/undo.py index 5f10c0f05c1acb..f1d03f4c9ed5ef 100644 --- a/Lib/idlelib/undo.py +++ b/Lib/idlelib/undo.py @@ -339,23 +339,23 @@ def bump_depth(self, incr=1): def _undo_delegator(parent): # htest # from tkinter import Toplevel, Text, Button from idlelib.percolator import Percolator - undowin = Toplevel(parent) - undowin.title("Test UndoDelegator") + top = Toplevel(parent) + top.title("Test UndoDelegator") x, y = map(int, parent.geometry().split('+')[1:]) - undowin.geometry("+%d+%d" % (x, y + 175)) + top.geometry("+%d+%d" % (x, y + 175)) - text = Text(undowin, height=10) + text = Text(top, height=10) text.pack() text.focus_set() p = Percolator(text) d = UndoDelegator() p.insertfilter(d) - undo = Button(undowin, text="Undo", command=lambda:d.undo_event(None)) + undo = 
Button(top, text="Undo", command=lambda:d.undo_event(None)) undo.pack(side='left') - redo = Button(undowin, text="Redo", command=lambda:d.redo_event(None)) + redo = Button(top, text="Redo", command=lambda:d.redo_event(None)) redo.pack(side='left') - dump = Button(undowin, text="Dump", command=lambda:d.dump_event(None)) + dump = Button(top, text="Dump", command=lambda:d.dump_event(None)) dump.pack(side='left') if __name__ == "__main__": From 5ada3e5e91487e010f5377329b9aa12247d77cfb Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 07:59:19 +0100 Subject: [PATCH 256/265] [3.11] gh-66819: More IDLE htest updates(4) (GH-112686) (#112689) Mostly double spacing before 'if __name__...'. (cherry picked from commit e5b0db0315941b968ebcc2414bfcdd2da44fd3c2) Co-authored-by: Terry Jan Reedy --- Lib/idlelib/browser.py | 1 + Lib/idlelib/calltip_w.py | 1 + Lib/idlelib/config.py | 1 + Lib/idlelib/debugobj.py | 1 + Lib/idlelib/delegator.py | 1 + Lib/idlelib/dynoption.py | 2 ++ Lib/idlelib/editor.py | 1 + Lib/idlelib/filelist.py | 1 + Lib/idlelib/grep.py | 6 ++-- Lib/idlelib/help.py | 2 ++ Lib/idlelib/idle_test/htest.py | 60 +++++++++++++++++----------------- Lib/idlelib/iomenu.py | 2 ++ Lib/idlelib/multicall.py | 1 + Lib/idlelib/outwin.py | 1 + Lib/idlelib/percolator.py | 2 ++ Lib/idlelib/pyshell.py | 1 + Lib/idlelib/redirector.py | 1 + Lib/idlelib/replace.py | 1 + Lib/idlelib/scrolledlist.py | 1 + Lib/idlelib/search.py | 1 + Lib/idlelib/sidebar.py | 4 +-- Lib/idlelib/statusbar.py | 1 + Lib/idlelib/tree.py | 1 + Lib/idlelib/undo.py | 1 + Lib/idlelib/util.py | 1 + 25 files changed, 62 insertions(+), 34 deletions(-) diff --git a/Lib/idlelib/browser.py b/Lib/idlelib/browser.py index 672e229ffbca94..8b9060e57072ea 100644 --- a/Lib/idlelib/browser.py +++ b/Lib/idlelib/browser.py @@ -250,6 +250,7 @@ def closure(): class Nested_in_closure: pass ModuleBrowser(parent, file, _htest=True) + if __name__ == "__main__": if len(sys.argv) == 1: # If pass file on command line, unittest fails. 
from unittest import main diff --git a/Lib/idlelib/calltip_w.py b/Lib/idlelib/calltip_w.py index 278546064adde2..9386376058c791 100644 --- a/Lib/idlelib/calltip_w.py +++ b/Lib/idlelib/calltip_w.py @@ -193,6 +193,7 @@ def calltip_hide(event): text.focus_set() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_calltip_w', verbosity=2, exit=False) diff --git a/Lib/idlelib/config.py b/Lib/idlelib/config.py index 898efeb4dd1550..92992fd9cce9cd 100644 --- a/Lib/idlelib/config.py +++ b/Lib/idlelib/config.py @@ -906,6 +906,7 @@ def dumpCfg(cfg): dumpCfg(idleConf.userCfg) print('\nlines = ', line, ', crc = ', crc, sep='') + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_config', verbosity=2, exit=False) diff --git a/Lib/idlelib/debugobj.py b/Lib/idlelib/debugobj.py index 0bf2cb1d5bbfe2..156377f8ed26ac 100644 --- a/Lib/idlelib/debugobj.py +++ b/Lib/idlelib/debugobj.py @@ -135,6 +135,7 @@ def _debug_object_browser(parent): # htest # node = TreeNode(sc.canvas, None, item) node.update() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_debugobj', verbosity=2, exit=False) diff --git a/Lib/idlelib/delegator.py b/Lib/idlelib/delegator.py index 55c95da8532f47..93ae8bbd43ff44 100644 --- a/Lib/idlelib/delegator.py +++ b/Lib/idlelib/delegator.py @@ -28,6 +28,7 @@ def setdelegate(self, delegate): self.resetcache() self.delegate = delegate + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_delegator', verbosity=2) diff --git a/Lib/idlelib/dynoption.py b/Lib/idlelib/dynoption.py index d5dfc3eda13f60..b8937f7106ca75 100644 --- a/Lib/idlelib/dynoption.py +++ b/Lib/idlelib/dynoption.py @@ -29,6 +29,7 @@ def SetMenu(self,valueList,value=None): if value: self.variable.set(value) + def _dyn_option_menu(parent): # htest # from tkinter import Toplevel # + StringVar, Button @@ -49,6 +50,7 @@ def update(): button = Button(top, text="Change option set", command=update) button.pack() + if __name__ == '__main__': # Only module without unittests because of intention to replace. 
from idlelib.idle_test.htest import run diff --git a/Lib/idlelib/editor.py b/Lib/idlelib/editor.py index 69b27d0683a104..6ad383f460c7ee 100644 --- a/Lib/idlelib/editor.py +++ b/Lib/idlelib/editor.py @@ -1748,6 +1748,7 @@ def _editor_window(parent): # htest # # Does not stop error, neither does following # edit.text.bind("<>", edit.close_event) + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_editor', verbosity=2, exit=False) diff --git a/Lib/idlelib/filelist.py b/Lib/idlelib/filelist.py index f87781d2570fe0..e27e5d32a0ff63 100644 --- a/Lib/idlelib/filelist.py +++ b/Lib/idlelib/filelist.py @@ -124,6 +124,7 @@ def _test(): # TODO check and convert to htest if flist.inversedict: root.mainloop() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_filelist', verbosity=2) diff --git a/Lib/idlelib/grep.py b/Lib/idlelib/grep.py index 12513594b76f8f..ef14349960bfa2 100644 --- a/Lib/idlelib/grep.py +++ b/Lib/idlelib/grep.py @@ -204,15 +204,17 @@ def _grep_dialog(parent): # htest # frame.pack() text = Text(frame, height=5) text.pack() + text.insert('1.0', 'import grep') def show_grep_dialog(): - text.tag_add(SEL, "1.0", END) + text.tag_add(SEL, "1.0", '1.end') grep(text, flist=flist) - text.tag_remove(SEL, "1.0", END) + text.tag_remove(SEL, "1.0", '1.end') button = Button(frame, text="Show GrepDialog", command=show_grep_dialog) button.pack() + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_grep', verbosity=2, exit=False) diff --git a/Lib/idlelib/help.py b/Lib/idlelib/help.py index 580a327f620a79..3cc7e36e35555b 100644 --- a/Lib/idlelib/help.py +++ b/Lib/idlelib/help.py @@ -278,6 +278,7 @@ def copy_strip(): out.write(line.rstrip() + b'\n') print(f'{src} copied to {dst}') + def _helpwindow(parent): "Create HelpWindow; called from Idle Help event handler." filename = join(abspath(dirname(__file__)), 'help.html') @@ -286,6 +287,7 @@ def _helpwindow(parent): return HelpWindow(parent, filename, 'IDLE Help (%s)' % python_version()) + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_help', verbosity=2, exit=False) diff --git a/Lib/idlelib/idle_test/htest.py b/Lib/idlelib/idle_test/htest.py index 4042106bf44a9f..997f85ff5a78b2 100644 --- a/Lib/idlelib/idle_test/htest.py +++ b/Lib/idlelib/idle_test/htest.py @@ -170,8 +170,8 @@ 'msg': "Click the 'Show GrepDialog' button.\n" "Test the various 'Find-in-files' functions.\n" "The results should be displayed in a new '*Output*' window.\n" - "'Right-click'->'Go to file/line' anywhere in the search results " - "should open that file \nin a new EditorWindow." + "'Right-click'->'Go to file/line' in the search results\n " + "should open that file in a new EditorWindow." } HelpSource_spec = { @@ -210,26 +210,6 @@ "Check that changes were saved by opening the file elsewhere." } -_linenumbers_drag_scrolling_spec = { - 'file': 'sidebar', - 'kwds': {}, - 'msg': textwrap.dedent("""\ - 1. Click on the line numbers and drag down below the edge of the - window, moving the mouse a bit and then leaving it there for a - while. The text and line numbers should gradually scroll down, - with the selection updated continuously. - - 2. With the lines still selected, click on a line number above - or below the selected lines. Only the line whose number was - clicked should be selected. - - 3. Repeat step #1, dragging to above the window. The text and - line numbers should gradually scroll up, with the selection - updated continuously. - - 4. 
Repeat step #2, clicking a line number below the selection."""), - } - _multi_call_spec = { 'file': 'multicall', 'kwds': {}, @@ -295,6 +275,15 @@ "Click [Close] or [X] to close the 'Replace Dialog'." } +_scrolled_list_spec = { + 'file': 'scrolledlist', + 'kwds': {}, + 'msg': "You should see a scrollable list of items\n" + "Selecting (clicking) or double clicking an item " + "prints the name to the console or Idle shell.\n" + "Right clicking an item will display a popup." + } + _search_dialog_spec = { 'file': 'search', 'kwds': {}, @@ -310,21 +299,31 @@ "Its only action is to close." } -_scrolled_list_spec = { - 'file': 'scrolledlist', +_sidebar_number_scrolling_spec = { + 'file': 'sidebar', 'kwds': {}, - 'msg': "You should see a scrollable list of items\n" - "Selecting (clicking) or double clicking an item " - "prints the name to the console or Idle shell.\n" - "Right clicking an item will display a popup." + 'msg': textwrap.dedent("""\ + 1. Click on the line numbers and drag down below the edge of the + window, moving the mouse a bit and then leaving it there for a + while. The text and line numbers should gradually scroll down, + with the selection updated continuously. + + 2. With the lines still selected, click on a line number above + or below the selected lines. Only the line whose number was + clicked should be selected. + + 3. Repeat step #1, dragging to above the window. The text and + line numbers should gradually scroll up, with the selection + updated continuously. + + 4. Repeat step #2, clicking a line number below the selection."""), } _stackbrowser_spec = { 'file': 'stackviewer', 'kwds': {}, 'msg': "A stacktrace for a NameError exception.\n" - "Expand 'idlelib ...' and ''.\n" - "Check that exc_value, exc_tb, and exc_type are correct.\n" + "Should have NameError and 1 traceback line." 
} _tooltip_spec = { @@ -438,5 +437,6 @@ def close(_=None): next_test() root.mainloop() + if __name__ == '__main__': run() diff --git a/Lib/idlelib/iomenu.py b/Lib/idlelib/iomenu.py index 7629101635b8bb..667623ec71ac98 100644 --- a/Lib/idlelib/iomenu.py +++ b/Lib/idlelib/iomenu.py @@ -393,6 +393,7 @@ def updaterecentfileslist(self,filename): if self.editwin.flist: self.editwin.update_recent_files_list(filename) + def _io_binding(parent): # htest # from tkinter import Toplevel, Text @@ -430,6 +431,7 @@ def savecopy(self, event): editwin = MyEditWin(text) IOBinding(editwin) + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_iomenu', verbosity=2, exit=False) diff --git a/Lib/idlelib/multicall.py b/Lib/idlelib/multicall.py index 2aa4a54125156f..41f81813113062 100644 --- a/Lib/idlelib/multicall.py +++ b/Lib/idlelib/multicall.py @@ -442,6 +442,7 @@ def handler(event): bindseq("") bindseq("") + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_mainmenu', verbosity=2, exit=False) diff --git a/Lib/idlelib/outwin.py b/Lib/idlelib/outwin.py index 610031e26f1dff..5ed3f35a7af655 100644 --- a/Lib/idlelib/outwin.py +++ b/Lib/idlelib/outwin.py @@ -182,6 +182,7 @@ def setup(self): text.tag_raise('sel') self.write = self.owin.write + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_outwin', verbosity=2, exit=False) diff --git a/Lib/idlelib/percolator.py b/Lib/idlelib/percolator.py index 91ad7272f4ae56..aa73427c4915c8 100644 --- a/Lib/idlelib/percolator.py +++ b/Lib/idlelib/percolator.py @@ -103,6 +103,7 @@ def toggle2(): (pin if var2.get() else pout)(t2) text.pack() + text.focus_set() var1 = tk.IntVar(parent) cb1 = tk.Checkbutton(top, text="Tracer1", command=toggle1, variable=var1) cb1.pack() @@ -110,6 +111,7 @@ def toggle2(): cb2 = tk.Checkbutton(top, text="Tracer2", command=toggle2, variable=var2) cb2.pack() + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_percolator', verbosity=2, exit=False) diff --git a/Lib/idlelib/pyshell.py b/Lib/idlelib/pyshell.py index 3c3d715e9894a8..037d6a352ff162 100755 --- a/Lib/idlelib/pyshell.py +++ b/Lib/idlelib/pyshell.py @@ -1694,6 +1694,7 @@ def main(): root.destroy() capture_warnings(False) + if __name__ == "__main__": main() diff --git a/Lib/idlelib/redirector.py b/Lib/idlelib/redirector.py index 4928340e98df68..08728956abd900 100644 --- a/Lib/idlelib/redirector.py +++ b/Lib/idlelib/redirector.py @@ -164,6 +164,7 @@ def my_insert(*args): original_insert(*args) original_insert = redir.register("insert", my_insert) + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_redirector', verbosity=2, exit=False) diff --git a/Lib/idlelib/replace.py b/Lib/idlelib/replace.py index ca83173877ad1d..a29ca591427491 100644 --- a/Lib/idlelib/replace.py +++ b/Lib/idlelib/replace.py @@ -299,6 +299,7 @@ def show_replace(): button = Button(frame, text="Replace", command=show_replace) button.pack() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_replace', verbosity=2, exit=False) diff --git a/Lib/idlelib/scrolledlist.py b/Lib/idlelib/scrolledlist.py index 4f1241a576fca1..4fb418db326255 100644 --- a/Lib/idlelib/scrolledlist.py +++ b/Lib/idlelib/scrolledlist.py @@ -142,6 +142,7 @@ def on_double(self, index): print("double", self.get(index)) for i in range(30): scrolled_list.append("Item %02d" % i) + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_scrolledlist', 
verbosity=2, exit=False) diff --git a/Lib/idlelib/search.py b/Lib/idlelib/search.py index b35f3b59c3d2e8..935a4832257fa4 100644 --- a/Lib/idlelib/search.py +++ b/Lib/idlelib/search.py @@ -156,6 +156,7 @@ def show_find(): button = Button(frame, text="Search (selection ignored)", command=show_find) button.pack() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_search', verbosity=2, exit=False) diff --git a/Lib/idlelib/sidebar.py b/Lib/idlelib/sidebar.py index 5dc17fce9cd720..f05a2b00623eb0 100644 --- a/Lib/idlelib/sidebar.py +++ b/Lib/idlelib/sidebar.py @@ -514,7 +514,7 @@ def update_colors(self): self.change_callback() -def _linenumbers_drag_scrolling(parent): # htest # +def _sidebar_number_scrolling(parent): # htest # from idlelib.idle_test.test_sidebar import Dummy_editwin top = tk.Toplevel(parent) @@ -541,4 +541,4 @@ def _linenumbers_drag_scrolling(parent): # htest # main('idlelib.idle_test.test_sidebar', verbosity=2, exit=False) from idlelib.idle_test.htest import run - run(_linenumbers_drag_scrolling) + run(_sidebar_number_scrolling) diff --git a/Lib/idlelib/statusbar.py b/Lib/idlelib/statusbar.py index 7048bd64b98753..8445d4cc8dfdb9 100644 --- a/Lib/idlelib/statusbar.py +++ b/Lib/idlelib/statusbar.py @@ -43,6 +43,7 @@ def change(): button.pack(side='bottom') frame.pack() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_statusbar', verbosity=2, exit=False) diff --git a/Lib/idlelib/tree.py b/Lib/idlelib/tree.py index 5f30f0f6092bfa..9c2eb47b24aec9 100644 --- a/Lib/idlelib/tree.py +++ b/Lib/idlelib/tree.py @@ -492,6 +492,7 @@ def _tree_widget(parent): # htest # node = TreeNode(sc.canvas, None, item) node.expand() + if __name__ == '__main__': from unittest import main main('idlelib.idle_test.test_tree', verbosity=2, exit=False) diff --git a/Lib/idlelib/undo.py b/Lib/idlelib/undo.py index f1d03f4c9ed5ef..f52446d5fcdcf8 100644 --- a/Lib/idlelib/undo.py +++ b/Lib/idlelib/undo.py @@ -358,6 +358,7 @@ def _undo_delegator(parent): # htest # dump = Button(top, text="Dump", command=lambda:d.dump_event(None)) dump.pack(side='left') + if __name__ == "__main__": from unittest import main main('idlelib.idle_test.test_undo', verbosity=2, exit=False) diff --git a/Lib/idlelib/util.py b/Lib/idlelib/util.py index ede670a4db5536..5ac69a7b94cb56 100644 --- a/Lib/idlelib/util.py +++ b/Lib/idlelib/util.py @@ -16,6 +16,7 @@ # .pyw is for Windows; .pyi is for stub files. py_extensions = ('.py', '.pyw', '.pyi') # Order needed for open/save dialogs. 
+
 if __name__ == '__main__':
     from unittest import main
     main('idlelib.idle_test.test_util', verbosity=2)

From 3eea835d85814b3cda8e43a49e7718060e701195 Mon Sep 17 00:00:00 2001
From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com>
Date: Mon, 4 Dec 2023 09:18:03 +0100
Subject: [PATCH 257/265] [3.11] gh-112678: Declare `Tkapp_CallDeallocArgs()` as `static` (GH-112679) (GH-112691)

(cherry picked from commit 23e001fa9f1897ba3384c02bbbe634313358a549)

Co-authored-by: Christopher Chavez
---
 Modules/_tkinter.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Modules/_tkinter.c b/Modules/_tkinter.c
index e5377073ff765e..005036d3ff2e4e 100644
--- a/Modules/_tkinter.c
+++ b/Modules/_tkinter.c
@@ -1249,7 +1249,7 @@ typedef struct Tkapp_CallEvent {
     Tcl_Condition *done;
 } Tkapp_CallEvent;
 
-void
+static void
 Tkapp_CallDeallocArgs(Tcl_Obj** objv, Tcl_Obj** objStore, int objc)
 {
     int i;

From 7e89dd99266c394b1fda6328eeca5f3fa3abb1b9 Mon Sep 17 00:00:00 2001
From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com>
Date: Mon, 4 Dec 2023 09:37:25 +0100
Subject: [PATCH 258/265] [3.11] gh-112625: Protect bytearray from being freed by misbehaving iterator inside bytearray.join (GH-112626) (GH-112694)

(cherry picked from commit 0e732d0997cff08855d98c17af4dd5527f10e419)

Co-authored-by: chilaxan
---
 Lib/test/test_builtin.py                        | 17 +++++++++++++++++
 ...23-12-03-19-34-51.gh-issue-112625.QWTlwS.rst |  1 +
 Objects/bytearrayobject.c                       |  5 ++++-
 3 files changed, 22 insertions(+), 1 deletion(-)
 create mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst

diff --git a/Lib/test/test_builtin.py b/Lib/test/test_builtin.py
index d6edc2b431d32a..a34f6daaad6923 100644
--- a/Lib/test/test_builtin.py
+++ b/Lib/test/test_builtin.py
@@ -1987,6 +1987,23 @@ def test_bytearray_extend_error(self):
         bad_iter = map(int, "X")
         self.assertRaises(ValueError, array.extend, bad_iter)
 
+    def test_bytearray_join_with_misbehaving_iterator(self):
+        # Issue #112625
+        array = bytearray(b',')
+        def iterator():
+            array.clear()
+            yield b'A'
+            yield b'B'
+        self.assertRaises(BufferError, array.join, iterator())
+
+    def test_bytearray_join_with_custom_iterator(self):
+        # Issue #112625
+        array = bytearray(b',')
+        def iterator():
+            yield b'A'
+            yield b'B'
+        self.assertEqual(bytearray(b'A,B'), array.join(iterator()))
+
     def test_construct_singletons(self):
         for const in None, Ellipsis, NotImplemented:
             tp = type(const)
diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst b/Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst
new file mode 100644
index 00000000000000..4970e10f3f4dcb
--- /dev/null
+++ b/Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst
@@ -0,0 +1 @@
+Fixes a bug where a bytearray object could be cleared while iterating over an argument in the ``bytearray.join()`` method that could result in reading memory after it was freed.
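
A minimal sketch of the scenario described above (illustrative only, not part of the patch; the names are made up). The new tests above exercise exactly this: with the change applied, ``bytearray.join()`` keeps its own buffer exported while it consumes the iterable, so an iterator that mutates the bytearray mid-join now raises ``BufferError`` instead of letting ``join()`` read freed memory::

    array = bytearray(b',')

    def misbehaving():
        array.clear()   # tries to resize the bytearray while join() is still using it
        yield b'A'
        yield b'B'

    try:
        array.join(misbehaving())
    except BufferError:
        # The exported buffer blocks the resize, so the read-after-free cannot happen.
        pass
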
diff --git a/Objects/bytearrayobject.c b/Objects/bytearrayobject.c index 6d46ebe2a44808..1c502030842120 100644 --- a/Objects/bytearrayobject.c +++ b/Objects/bytearrayobject.c @@ -2008,7 +2008,10 @@ static PyObject * bytearray_join(PyByteArrayObject *self, PyObject *iterable_of_bytes) /*[clinic end generated code: output=a8516370bf68ae08 input=aba6b1f9b30fcb8e]*/ { - return stringlib_bytes_join((PyObject*)self, iterable_of_bytes); + self->ob_exports++; // this protects `self` from being cleared/resized if `iterable_of_bytes` is a custom iterator + PyObject* ret = stringlib_bytes_join((PyObject*)self, iterable_of_bytes); + self->ob_exports--; // unexport `self` + return ret; } /*[clinic input] From f60087836e9ea00be95764f12026896e0ac90777 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 12:11:24 +0100 Subject: [PATCH 259/265] [3.11] gh-101100: Fix Sphinx nitpicks in `library/functions.rst` (GH-112669) (#112698) gh-101100: Fix Sphinx nitpicks in `library/functions.rst` (GH-112669) (cherry picked from commit cda737924fd616c4e08027888258f97e81f34447) Co-authored-by: Alex Waygood --- Doc/library/functions.rst | 113 ++++++++++++++++++++++---------------- Doc/tools/.nitignore | 1 - 2 files changed, 67 insertions(+), 47 deletions(-) diff --git a/Doc/library/functions.rst b/Doc/library/functions.rst index 0e5f36f57daf17..22e46be9c200f8 100644 --- a/Doc/library/functions.rst +++ b/Doc/library/functions.rst @@ -57,7 +57,8 @@ are always available. They are listed here in alphabetical order. .. function:: abs(x) Return the absolute value of a number. The argument may be an - integer, a floating point number, or an object implementing :meth:`__abs__`. + integer, a floating point number, or an object implementing + :meth:`~object.__abs__`. If the argument is a complex number, its magnitude is returned. @@ -235,7 +236,7 @@ are always available. They are listed here in alphabetical order. :const:`False` if not. If this returns ``True``, it is still possible that a call fails, but if it is ``False``, calling *object* will never succeed. Note that classes are callable (calling a class returns a new instance); - instances are callable if their class has a :meth:`__call__` method. + instances are callable if their class has a :meth:`~object.__call__` method. .. versionadded:: 3.2 This function was first removed in Python 3.0 and then brought back @@ -432,15 +433,18 @@ are always available. They are listed here in alphabetical order. Without arguments, return the list of names in the current local scope. With an argument, attempt to return a list of valid attributes for that object. - If the object has a method named :meth:`__dir__`, this method will be called and + If the object has a method named :meth:`~object.__dir__`, + this method will be called and must return the list of attributes. This allows objects that implement a custom - :func:`__getattr__` or :func:`__getattribute__` function to customize the way + :func:`~object.__getattr__` or :func:`~object.__getattribute__` function + to customize the way :func:`dir` reports their attributes. - If the object does not provide :meth:`__dir__`, the function tries its best to - gather information from the object's :attr:`~object.__dict__` attribute, if defined, and + If the object does not provide :meth:`~object.__dir__`, + the function tries its best to gather information from the object's + :attr:`~object.__dict__` attribute, if defined, and from its type object. 
The resulting list is not necessarily complete and may - be inaccurate when the object has a custom :func:`__getattr__`. + be inaccurate when the object has a custom :func:`~object.__getattr__`. The default :func:`dir` mechanism behaves differently with different types of objects, as it attempts to produce the most relevant, rather than complete, @@ -663,7 +667,7 @@ are always available. They are listed here in alphabetical order. sign: "+" | "-" infinity: "Infinity" | "inf" nan: "nan" - digitpart: `digit` (["_"] `digit`)* + digitpart: `!digit` (["_"] `!digit`)* number: [`digitpart`] "." `digitpart` | `digitpart` ["."] exponent: ("e" | "E") ["+" | "-"] `digitpart` floatnumber: number [`exponent`] @@ -726,8 +730,8 @@ are always available. They are listed here in alphabetical order. A call to ``format(value, format_spec)`` is translated to ``type(value).__format__(value, format_spec)`` which bypasses the instance - dictionary when searching for the value's :meth:`__format__` method. A - :exc:`TypeError` exception is raised if the method search reaches + dictionary when searching for the value's :meth:`~object.__format__` method. + A :exc:`TypeError` exception is raised if the method search reaches :mod:`object` and the *format_spec* is non-empty, or if either the *format_spec* or the return value are not strings. @@ -791,9 +795,9 @@ are always available. They are listed here in alphabetical order. .. note:: - For objects with custom :meth:`__hash__` methods, note that :func:`hash` + For objects with custom :meth:`~object.__hash__` methods, + note that :func:`hash` truncates the return value based on the bit width of the host machine. - See :meth:`__hash__ ` for details. .. function:: help() help(request) @@ -981,7 +985,8 @@ are always available. They are listed here in alphabetical order. Return an :term:`iterator` object. The first argument is interpreted very differently depending on the presence of the second argument. Without a second argument, *object* must be a collection object which supports the - :term:`iterable` protocol (the :meth:`__iter__` method), or it must support + :term:`iterable` protocol (the :meth:`~object.__iter__` method), + or it must support the sequence protocol (the :meth:`~object.__getitem__` method with integer arguments starting at ``0``). If it does not support either of those protocols, :exc:`TypeError` is raised. If the second argument, *sentinel*, is given, @@ -1499,38 +1504,44 @@ are always available. They are listed here in alphabetical order. """Get the current voltage.""" return self._voltage - The ``@property`` decorator turns the :meth:`voltage` method into a "getter" + The ``@property`` decorator turns the :meth:`!voltage` method into a "getter" for a read-only attribute with the same name, and it sets the docstring for *voltage* to "Get the current voltage." - A property object has :attr:`~property.getter`, :attr:`~property.setter`, - and :attr:`~property.deleter` methods usable as decorators that create a - copy of the property with the corresponding accessor function set to the - decorated function. This is best explained with an example:: + .. decorator:: property.getter + .. decorator:: property.setter + .. decorator:: property.deleter - class C: - def __init__(self): - self._x = None + A property object has ``getter``, ``setter``, + and ``deleter`` methods usable as decorators that create a + copy of the property with the corresponding accessor function set to the + decorated function. 
This is best explained with an example: - @property - def x(self): - """I'm the 'x' property.""" - return self._x + .. testcode:: - @x.setter - def x(self, value): - self._x = value + class C: + def __init__(self): + self._x = None - @x.deleter - def x(self): - del self._x + @property + def x(self): + """I'm the 'x' property.""" + return self._x - This code is exactly equivalent to the first example. Be sure to give the - additional functions the same name as the original property (``x`` in this - case.) + @x.setter + def x(self, value): + self._x = value - The returned property object also has the attributes ``fget``, ``fset``, and - ``fdel`` corresponding to the constructor arguments. + @x.deleter + def x(self): + del self._x + + This code is exactly equivalent to the first example. Be sure to give the + additional functions the same name as the original property (``x`` in this + case.) + + The returned property object also has the attributes ``fget``, ``fset``, and + ``fdel`` corresponding to the constructor arguments. .. versionchanged:: 3.5 The docstrings of property objects are now writeable. @@ -1553,7 +1564,8 @@ are always available. They are listed here in alphabetical order. representation is a string enclosed in angle brackets that contains the name of the type of the object together with additional information often including the name and address of the object. A class can control what this - function returns for its instances by defining a :meth:`__repr__` method. + function returns for its instances + by defining a :meth:`~object.__repr__` method. If :func:`sys.displayhook` is not accessible, this function will raise :exc:`RuntimeError`. @@ -1561,9 +1573,9 @@ are always available. They are listed here in alphabetical order. .. function:: reversed(seq) Return a reverse :term:`iterator`. *seq* must be an object which has - a :meth:`__reversed__` method or supports the sequence protocol (the - :meth:`__len__` method and the :meth:`~object.__getitem__` method with integer - arguments starting at ``0``). + a :meth:`~object.__reversed__` method or supports the sequence protocol (the + :meth:`~object.__len__` method and the :meth:`~object.__getitem__` method + with integer arguments starting at ``0``). .. function:: round(number, ndigits=None) @@ -1634,13 +1646,21 @@ are always available. They are listed here in alphabetical order. Return a :term:`slice` object representing the set of indices specified by ``range(start, stop, step)``. The *start* and *step* arguments default to - ``None``. Slice objects have read-only data attributes :attr:`~slice.start`, - :attr:`~slice.stop`, and :attr:`~slice.step` which merely return the argument - values (or their default). They have no other explicit functionality; - however, they are used by NumPy and other third-party packages. + ``None``. + + .. attribute:: slice.start + .. attribute:: slice.stop + .. attribute:: slice.step + + Slice objects have read-only data attributes :attr:`!start`, + :attr:`!stop`, and :attr:`!step` which merely return the argument + values (or their default). They have no other explicit functionality; + however, they are used by NumPy and other third-party packages. + Slice objects are also generated when extended indexing syntax is used. For example: ``a[start:stop:step]`` or ``a[start:stop, i]``. See - :func:`itertools.islice` for an alternate version that returns an iterator. + :func:`itertools.islice` for an alternate version that returns an + :term:`iterator`. .. 
function:: sorted(iterable, /, *, key=None, reverse=False) @@ -1800,7 +1820,8 @@ are always available. They are listed here in alphabetical order. Note that :func:`super` is implemented as part of the binding process for explicit dotted attribute lookups such as ``super().__getitem__(name)``. - It does so by implementing its own :meth:`__getattribute__` method for searching + It does so by implementing its own :meth:`~object.__getattribute__` method + for searching classes in a predictable order that supports cooperative multiple inheritance. Accordingly, :func:`super` is undefined for implicit lookups using statements or operators such as ``super()[name]``. diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 9de2b233ea151f..642d1ee118c6f5 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -60,7 +60,6 @@ Doc/library/exceptions.rst Doc/library/faulthandler.rst Doc/library/fcntl.rst Doc/library/ftplib.rst -Doc/library/functions.rst Doc/library/functools.rst Doc/library/http.client.rst Doc/library/http.cookiejar.rst From 28afd8ddad8088dc3b251f18020b53b277205e5f Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 12:54:39 +0100 Subject: [PATCH 260/265] [3.11] gh-101100: Fix sphinx warnings in `Doc/library/__future__.rst` (GH-109814) (#112701) gh-101100: Fix sphinx warnings in `Doc/library/__future__.rst` (GH-109814) (cherry picked from commit f2eaa92b0cc5a37a9e6010c7c6f5ad1a230ea49b) Co-authored-by: Nikita Sobolev --- Doc/library/__future__.rst | 58 +++++++++++++++++++++----------------- Doc/tools/.nitignore | 1 - 2 files changed, 32 insertions(+), 27 deletions(-) diff --git a/Doc/library/__future__.rst b/Doc/library/__future__.rst index 8bd23daee73977..d261e4a4f338a5 100644 --- a/Doc/library/__future__.rst +++ b/Doc/library/__future__.rst @@ -22,42 +22,48 @@ can be inspected programmatically via importing :mod:`__future__` and examining its contents. -Each statement in :file:`__future__.py` is of the form:: +.. _future-classes: - FeatureName = _Feature(OptionalRelease, MandatoryRelease, - CompilerFlag) +.. class:: _Feature + Each statement in :file:`__future__.py` is of the form:: -where, normally, *OptionalRelease* is less than *MandatoryRelease*, and both are -5-tuples of the same form as :data:`sys.version_info`:: + FeatureName = _Feature(OptionalRelease, MandatoryRelease, + CompilerFlag) - (PY_MAJOR_VERSION, # the 2 in 2.1.0a3; an int - PY_MINOR_VERSION, # the 1; an int - PY_MICRO_VERSION, # the 0; an int - PY_RELEASE_LEVEL, # "alpha", "beta", "candidate" or "final"; string - PY_RELEASE_SERIAL # the 3; an int - ) + where, normally, *OptionalRelease* is less than *MandatoryRelease*, and both are + 5-tuples of the same form as :data:`sys.version_info`:: -*OptionalRelease* records the first release in which the feature was accepted. + (PY_MAJOR_VERSION, # the 2 in 2.1.0a3; an int + PY_MINOR_VERSION, # the 1; an int + PY_MICRO_VERSION, # the 0; an int + PY_RELEASE_LEVEL, # "alpha", "beta", "candidate" or "final"; string + PY_RELEASE_SERIAL # the 3; an int + ) -In the case of a *MandatoryRelease* that has not yet occurred, -*MandatoryRelease* predicts the release in which the feature will become part of -the language. +.. method:: _Feature.getOptionalRelease() -Else *MandatoryRelease* records when the feature became part of the language; in -releases at or after that, modules no longer need a future statement to use the -feature in question, but may continue to use such imports. 
+ *OptionalRelease* records the first release in which the feature was accepted. -*MandatoryRelease* may also be ``None``, meaning that a planned feature got -dropped. +.. method:: _Feature.getMandatoryRelease() -Instances of class :class:`_Feature` have two corresponding methods, -:meth:`getOptionalRelease` and :meth:`getMandatoryRelease`. + In the case of a *MandatoryRelease* that has not yet occurred, + *MandatoryRelease* predicts the release in which the feature will become part of + the language. -*CompilerFlag* is the (bitfield) flag that should be passed in the fourth -argument to the built-in function :func:`compile` to enable the feature in -dynamically compiled code. This flag is stored in the :attr:`compiler_flag` -attribute on :class:`_Feature` instances. + Else *MandatoryRelease* records when the feature became part of the language; in + releases at or after that, modules no longer need a future statement to use the + feature in question, but may continue to use such imports. + + *MandatoryRelease* may also be ``None``, meaning that a planned feature got + dropped or that it is not yet decided. + +.. attribute:: _Feature.compiler_flag + + *CompilerFlag* is the (bitfield) flag that should be passed in the fourth + argument to the built-in function :func:`compile` to enable the feature in + dynamically compiled code. This flag is stored in the :attr:`_Feature.compiler_flag` + attribute on :class:`_Feature` instances. No feature description will ever be deleted from :mod:`__future__`. Since its introduction in Python 2.1 the following features have found their way into the diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index 642d1ee118c6f5..ab47453b09cc26 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -26,7 +26,6 @@ Doc/howto/enum.rst Doc/howto/isolating-extensions.rst Doc/howto/logging.rst Doc/howto/urllib2.rst -Doc/library/__future__.rst Doc/library/abc.rst Doc/library/ast.rst Doc/library/asyncio-extending.rst From e3d380ded6efd25d60a21b7eb913142e86153e33 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 13:20:19 +0100 Subject: [PATCH 261/265] [3.11] gh-109786: Fix leaks and crash when re-enter itertools.pairwise.__next__() (GH-109788) (GH-112700) (cherry picked from commit 6ca9d3e0173c38e2eac50367b187d4c1d43f9892) Co-authored-by: Serhiy Storchaka --- Lib/test/test_itertools.py | 72 +++++++++++++++++++ ...-09-23-14-40-51.gh-issue-109786.UX3pKv.rst | 2 + Modules/itertoolsmodule.c | 13 +++- 3 files changed, 85 insertions(+), 2 deletions(-) create mode 100644 Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst diff --git a/Lib/test/test_itertools.py b/Lib/test/test_itertools.py index 0d0030e4160995..c912ceba61a0b7 100644 --- a/Lib/test/test_itertools.py +++ b/Lib/test/test_itertools.py @@ -1079,6 +1079,78 @@ def test_pairwise(self): with self.assertRaises(TypeError): pairwise(None) # non-iterable argument + def test_pairwise_reenter(self): + def check(reenter_at, expected): + class I: + count = 0 + def __iter__(self): + return self + def __next__(self): + self.count +=1 + if self.count in reenter_at: + return next(it) + return [self.count] # new object + + it = pairwise(I()) + for item in expected: + self.assertEqual(next(it), item) + + check({1}, [ + (([2], [3]), [4]), + ([4], [5]), + ]) + check({2}, [ + ([1], ([1], [3])), + (([1], [3]), [4]), + ([4], [5]), + ]) + check({3}, [ + ([1], [2]), + ([2], ([2], [4])), + (([2], [4]), [5]), + ([5], [6]), + ]) + 
check({1, 2}, [ + ((([3], [4]), [5]), [6]), + ([6], [7]), + ]) + check({1, 3}, [ + (([2], ([2], [4])), [5]), + ([5], [6]), + ]) + check({1, 4}, [ + (([2], [3]), (([2], [3]), [5])), + ((([2], [3]), [5]), [6]), + ([6], [7]), + ]) + check({2, 3}, [ + ([1], ([1], ([1], [4]))), + (([1], ([1], [4])), [5]), + ([5], [6]), + ]) + + def test_pairwise_reenter2(self): + def check(maxcount, expected): + class I: + count = 0 + def __iter__(self): + return self + def __next__(self): + if self.count >= maxcount: + raise StopIteration + self.count +=1 + if self.count == 1: + return next(it, None) + return [self.count] # new object + + it = pairwise(I()) + self.assertEqual(list(it), expected) + + check(1, []) + check(2, []) + check(3, []) + check(4, [(([2], [3]), [4])]) + def test_product(self): for args, result in [ ([], [()]), # zero iterables diff --git a/Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst b/Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst new file mode 100644 index 00000000000000..07222fa339d703 --- /dev/null +++ b/Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst @@ -0,0 +1,2 @@ +Fix possible reference leaks and crash when re-enter the ``__next__()`` method of +:class:`itertools.pairwise`. diff --git a/Modules/itertoolsmodule.c b/Modules/itertoolsmodule.c index d5cdfc591e26f4..7a02a0d25b2062 100644 --- a/Modules/itertoolsmodule.c +++ b/Modules/itertoolsmodule.c @@ -119,21 +119,30 @@ pairwise_next(pairwiseobject *po) return NULL; } if (old == NULL) { - po->old = old = (*Py_TYPE(it)->tp_iternext)(it); + old = (*Py_TYPE(it)->tp_iternext)(it); + Py_XSETREF(po->old, old); if (old == NULL) { Py_CLEAR(po->it); return NULL; } + it = po->it; + if (it == NULL) { + Py_CLEAR(po->old); + return NULL; + } } + Py_INCREF(old); new = (*Py_TYPE(it)->tp_iternext)(it); if (new == NULL) { Py_CLEAR(po->it); Py_CLEAR(po->old); + Py_DECREF(old); return NULL; } /* Future optimization: Reuse the result tuple as we do in enumerate() */ result = PyTuple_Pack(2, old, new); - Py_SETREF(po->old, new); + Py_XSETREF(po->old, new); + Py_DECREF(old); return result; } From 798d43722b60fc1372848827944904590d26ed75 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 13:48:58 +0100 Subject: [PATCH 262/265] [3.11] gh-101100: Fix Sphinx nitpicks in `library/abc.rst` (GH-112703) (#112704) gh-101100: Fix Sphinx nitpicks in `library/abc.rst` (GH-112703) (cherry picked from commit 9560e0d6d7a316341939b4016e47e03bd5bf17c3) Co-authored-by: Alex Waygood --- Doc/library/abc.rst | 42 +++++++++++++++++++++--------------------- Doc/tools/.nitignore | 1 - 2 files changed, 21 insertions(+), 22 deletions(-) diff --git a/Doc/library/abc.rst b/Doc/library/abc.rst index fb4f9da169c5ab..c073ea955abaa4 100644 --- a/Doc/library/abc.rst +++ b/Doc/library/abc.rst @@ -21,7 +21,7 @@ The :mod:`collections` module has some concrete classes that derive from ABCs; these can, of course, be further derived. In addition, the :mod:`collections.abc` submodule has some ABCs that can be used to test whether a class or instance provides a particular interface, for example, if it is -:term:`hashable` or if it is a mapping. +:term:`hashable` or if it is a :term:`mapping`. This module provides the metaclass :class:`ABCMeta` for defining ABCs and @@ -30,7 +30,7 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: .. class:: ABC A helper class that has :class:`ABCMeta` as its metaclass. 
With this class, - an abstract base class can be created by simply deriving from :class:`ABC` + an abstract base class can be created by simply deriving from :class:`!ABC` avoiding sometimes confusing metaclass usage, for example:: from abc import ABC @@ -38,11 +38,11 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: class MyABC(ABC): pass - Note that the type of :class:`ABC` is still :class:`ABCMeta`, therefore - inheriting from :class:`ABC` requires the usual precautions regarding + Note that the type of :class:`!ABC` is still :class:`ABCMeta`, therefore + inheriting from :class:`!ABC` requires the usual precautions regarding metaclass usage, as multiple inheritance may lead to metaclass conflicts. One may also define an abstract base class by passing the metaclass - keyword and using :class:`ABCMeta` directly, for example:: + keyword and using :class:`!ABCMeta` directly, for example:: from abc import ABCMeta @@ -65,7 +65,7 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: implementations defined by the registering ABC be callable (not even via :func:`super`). [#]_ - Classes created with a metaclass of :class:`ABCMeta` have the following method: + Classes created with a metaclass of :class:`!ABCMeta` have the following method: .. method:: register(subclass) @@ -86,7 +86,7 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: Returns the registered subclass, to allow usage as a class decorator. .. versionchanged:: 3.4 - To detect calls to :meth:`register`, you can use the + To detect calls to :meth:`!register`, you can use the :func:`get_cache_token` function. You can also override this method in an abstract base class: @@ -96,10 +96,10 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: (Must be defined as a class method.) Check whether *subclass* is considered a subclass of this ABC. This means - that you can customize the behavior of ``issubclass`` further without the + that you can customize the behavior of :func:`issubclass` further without the need to call :meth:`register` on every class you want to consider a subclass of the ABC. (This class method is called from the - :meth:`__subclasscheck__` method of the ABC.) + :meth:`~class.__subclasscheck__` method of the ABC.) This method should return ``True``, ``False`` or ``NotImplemented``. If it returns ``True``, the *subclass* is considered a subclass of this ABC. @@ -142,7 +142,7 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: The ABC ``MyIterable`` defines the standard iterable method, :meth:`~iterator.__iter__`, as an abstract method. The implementation given - here can still be called from subclasses. The :meth:`get_iterator` method + here can still be called from subclasses. The :meth:`!get_iterator` method is also part of the ``MyIterable`` abstract base class, but it does not have to be overridden in non-abstract derived classes. @@ -153,14 +153,14 @@ a helper class :class:`ABC` to alternatively define ABCs through inheritance: Finally, the last line makes ``Foo`` a virtual subclass of ``MyIterable``, even though it does not define an :meth:`~iterator.__iter__` method (it uses - the old-style iterable protocol, defined in terms of :meth:`__len__` and + the old-style iterable protocol, defined in terms of :meth:`~object.__len__` and :meth:`~object.__getitem__`). Note that this will not make ``get_iterator`` available as a method of ``Foo``, so it is provided separately. 
-The :mod:`abc` module also provides the following decorator: +The :mod:`!abc` module also provides the following decorator: .. decorator:: abstractmethod @@ -168,19 +168,19 @@ The :mod:`abc` module also provides the following decorator: Using this decorator requires that the class's metaclass is :class:`ABCMeta` or is derived from it. A class that has a metaclass derived from - :class:`ABCMeta` cannot be instantiated unless all of its abstract methods + :class:`!ABCMeta` cannot be instantiated unless all of its abstract methods and properties are overridden. The abstract methods can be called using any - of the normal 'super' call mechanisms. :func:`abstractmethod` may be used + of the normal 'super' call mechanisms. :func:`!abstractmethod` may be used to declare abstract methods for properties and descriptors. Dynamically adding abstract methods to a class, or attempting to modify the abstraction status of a method or class once it is created, are only supported using the :func:`update_abstractmethods` function. The - :func:`abstractmethod` only affects subclasses derived using regular - inheritance; "virtual subclasses" registered with the ABC's :meth:`register` - method are not affected. + :func:`!abstractmethod` only affects subclasses derived using regular + inheritance; "virtual subclasses" registered with the ABC's + :meth:`~ABCMeta.register` method are not affected. - When :func:`abstractmethod` is applied in combination with other method + When :func:`!abstractmethod` is applied in combination with other method descriptors, it should be applied as the innermost decorator, as shown in the following usage examples:: @@ -216,7 +216,7 @@ The :mod:`abc` module also provides the following decorator: In order to correctly interoperate with the abstract base class machinery, the descriptor must identify itself as abstract using - :attr:`__isabstractmethod__`. In general, this attribute should be ``True`` + :attr:`!__isabstractmethod__`. In general, this attribute should be ``True`` if any of the methods used to compose the descriptor are abstract. For example, Python's built-in :class:`property` does the equivalent of:: @@ -236,7 +236,7 @@ The :mod:`abc` module also provides the following decorator: super-call in a framework that uses cooperative multiple-inheritance. -The :mod:`abc` module also supports the following legacy decorators: +The :mod:`!abc` module also supports the following legacy decorators: .. decorator:: abstractclassmethod @@ -323,7 +323,7 @@ The :mod:`abc` module also supports the following legacy decorators: ... -The :mod:`abc` module also provides the following functions: +The :mod:`!abc` module also provides the following functions: .. 
function:: get_cache_token() diff --git a/Doc/tools/.nitignore b/Doc/tools/.nitignore index ab47453b09cc26..063372e93df168 100644 --- a/Doc/tools/.nitignore +++ b/Doc/tools/.nitignore @@ -26,7 +26,6 @@ Doc/howto/enum.rst Doc/howto/isolating-extensions.rst Doc/howto/logging.rst Doc/howto/urllib2.rst -Doc/library/abc.rst Doc/library/ast.rst Doc/library/asyncio-extending.rst Doc/library/asyncio-policy.rst From 76c9c9cedf30c92f52c33852efeefdb232db4ac0 Mon Sep 17 00:00:00 2001 From: Seth Michael Larson Date: Mon, 4 Dec 2023 10:02:45 -0600 Subject: [PATCH 263/265] [3.11] GH-112160: Pin to manifest of quay.io/tiran/cpython_autoconf (#112161) --- Makefile.pre.in | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/Makefile.pre.in b/Makefile.pre.in index 21aecd4161adb2..c61b093fc9399a 100644 --- a/Makefile.pre.in +++ b/Makefile.pre.in @@ -2405,7 +2405,9 @@ regen-configure: @if command -v podman >/dev/null; then RUNTIME="podman"; else RUNTIME="docker"; fi; \ if ! command -v $$RUNTIME; then echo "$@ needs either Podman or Docker container runtime." >&2; exit 1; fi; \ if command -v selinuxenabled >/dev/null && selinuxenabled; then OPT=":Z"; fi; \ - CMD="$$RUNTIME run --rm --pull=always -v $(abs_srcdir):/src$$OPT quay.io/tiran/cpython_autoconf:269"; \ + # Manifest corresponds with tag '269' \ + CPYTHON_AUTOCONF_MANIFEST="sha256:f370fee95eefa3d57b00488bce4911635411fa83e2d293ced8cf8a3674ead939" \ + CMD="$$RUNTIME run --rm --pull=missing -v $(abs_srcdir):/src$$OPT quay.io/tiran/cpython_autoconf@$$CPYTHON_AUTOCONF_MANIFEST"; \ echo $$CMD; \ $$CMD || exit $? From 8dbda1cf3eab8696dfbb95f0ca60af8db39d51c4 Mon Sep 17 00:00:00 2001 From: "Miss Islington (bot)" <31488909+miss-islington@users.noreply.github.com> Date: Mon, 4 Dec 2023 17:16:03 +0100 Subject: [PATCH 264/265] =?UTF-8?q?[3.11]=20gh-108927:=20Fix=20removing=20?= =?UTF-8?q?testing=20modules=20from=20sys.modules=20(GH-108952)=20(=D0=9F?= =?UTF-8?q?=D0=A0-112712)?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit It breaks import machinery if the test module has submodules used in other tests. 
(cherry picked from commit e08b70fab1fbc45fa498020aac522ae1d5da6136)

Co-authored-by: Serhiy Storchaka
---
 Lib/test/libregrtest/main.py                  | 18 +++++++++++++-----
 Lib/test/libregrtest/single.py                |  4 ----
 .../import_from_tests/test_regrtest_a.py      | 11 +++++++++++
 .../test_regrtest_b/__init__.py               |  9 +++++++++
 .../import_from_tests/test_regrtest_b/util.py |  0
 .../import_from_tests/test_regrtest_c.py      | 11 +++++++++++
 Lib/test/test_regrtest.py                     | 19 +++++++++++++++++++
 ...-09-05-20-46-35.gh-issue-108927.TpwWav.rst |  4 ++++
 8 files changed, 67 insertions(+), 9 deletions(-)
 create mode 100644 Lib/test/regrtestdata/import_from_tests/test_regrtest_a.py
 create mode 100644 Lib/test/regrtestdata/import_from_tests/test_regrtest_b/__init__.py
 create mode 100644 Lib/test/regrtestdata/import_from_tests/test_regrtest_b/util.py
 create mode 100644 Lib/test/regrtestdata/import_from_tests/test_regrtest_c.py
 create mode 100644 Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst

diff --git a/Lib/test/libregrtest/main.py b/Lib/test/libregrtest/main.py
index beee0fa09500e1..a9725fa967370a 100644
--- a/Lib/test/libregrtest/main.py
+++ b/Lib/test/libregrtest/main.py
@@ -306,7 +306,7 @@ def run_tests_sequentially(self, runtests):
         else:
             tracer = None
 
-        save_modules = sys.modules.keys()
+        save_modules = set(sys.modules)
 
         jobs = runtests.get_jobs()
         if jobs is not None:
@@ -330,10 +330,18 @@ def run_tests_sequentially(self, runtests):
 
             result = self.run_test(test_name, runtests, tracer)
 
-            # Unload the newly imported modules (best effort finalization)
-            for module in sys.modules.keys():
-                if module not in save_modules and module.startswith("test."):
-                    support.unload(module)
+            # Unload the newly imported test modules (best effort finalization)
+            new_modules = [module for module in sys.modules
+                           if module not in save_modules and
+                           module.startswith(("test.", "test_"))]
+            for module in new_modules:
+                sys.modules.pop(module, None)
+                # Remove the attribute of the parent module.
+                parent, _, name = module.rpartition('.')
+                try:
+                    delattr(sys.modules[parent], name)
+                except (KeyError, AttributeError):
+                    pass
 
             if result.must_stop(self.fail_fast, self.fail_env_changed):
                 break
diff --git a/Lib/test/libregrtest/single.py b/Lib/test/libregrtest/single.py
index eafeb5fe26f3f3..235029d8620ff5 100644
--- a/Lib/test/libregrtest/single.py
+++ b/Lib/test/libregrtest/single.py
@@ -122,10 +122,6 @@ def _load_run_test(result: TestResult, runtests: RunTests) -> None:
     # Load the test module and run the tests.
test_name = result.test_name module_name = abs_module_name(test_name, runtests.test_dir) - - # Remove the module from sys.module to reload it if it was already imported - sys.modules.pop(module_name, None) - test_mod = importlib.import_module(module_name) if hasattr(test_mod, "test_main"): diff --git a/Lib/test/regrtestdata/import_from_tests/test_regrtest_a.py b/Lib/test/regrtestdata/import_from_tests/test_regrtest_a.py new file mode 100644 index 00000000000000..9c3d0c7cf4bfaa --- /dev/null +++ b/Lib/test/regrtestdata/import_from_tests/test_regrtest_a.py @@ -0,0 +1,11 @@ +import sys +import unittest +import test_regrtest_b.util + +class Test(unittest.TestCase): + def test(self): + test_regrtest_b.util # does not fail + self.assertIn('test_regrtest_a', sys.modules) + self.assertIs(sys.modules['test_regrtest_b'], test_regrtest_b) + self.assertIs(sys.modules['test_regrtest_b.util'], test_regrtest_b.util) + self.assertNotIn('test_regrtest_c', sys.modules) diff --git a/Lib/test/regrtestdata/import_from_tests/test_regrtest_b/__init__.py b/Lib/test/regrtestdata/import_from_tests/test_regrtest_b/__init__.py new file mode 100644 index 00000000000000..3dfba253455ad2 --- /dev/null +++ b/Lib/test/regrtestdata/import_from_tests/test_regrtest_b/__init__.py @@ -0,0 +1,9 @@ +import sys +import unittest + +class Test(unittest.TestCase): + def test(self): + self.assertNotIn('test_regrtest_a', sys.modules) + self.assertIn('test_regrtest_b', sys.modules) + self.assertNotIn('test_regrtest_b.util', sys.modules) + self.assertNotIn('test_regrtest_c', sys.modules) diff --git a/Lib/test/regrtestdata/import_from_tests/test_regrtest_b/util.py b/Lib/test/regrtestdata/import_from_tests/test_regrtest_b/util.py new file mode 100644 index 00000000000000..e69de29bb2d1d6 diff --git a/Lib/test/regrtestdata/import_from_tests/test_regrtest_c.py b/Lib/test/regrtestdata/import_from_tests/test_regrtest_c.py new file mode 100644 index 00000000000000..de80769118d709 --- /dev/null +++ b/Lib/test/regrtestdata/import_from_tests/test_regrtest_c.py @@ -0,0 +1,11 @@ +import sys +import unittest +import test_regrtest_b.util + +class Test(unittest.TestCase): + def test(self): + test_regrtest_b.util # does not fail + self.assertNotIn('test_regrtest_a', sys.modules) + self.assertIs(sys.modules['test_regrtest_b'], test_regrtest_b) + self.assertIs(sys.modules['test_regrtest_b.util'], test_regrtest_b.util) + self.assertIn('test_regrtest_c', sys.modules) diff --git a/Lib/test/test_regrtest.py b/Lib/test/test_regrtest.py index 2f1bb03bc0ba4e..2ab6f6a9862d6f 100644 --- a/Lib/test/test_regrtest.py +++ b/Lib/test/test_regrtest.py @@ -2021,6 +2021,25 @@ def test_dev_mode(self): self.check_executed_tests(output, tests, stats=len(tests), parallel=True) + def test_unload_tests(self): + # Test that unloading test modules does not break tests + # that import from other tests. + # The test execution order matters for this test. + # Both test_regrtest_a and test_regrtest_c which are executed before + # and after test_regrtest_b import a submodule from the test_regrtest_b + # package and use it in testing. test_regrtest_b itself does not import + # that submodule. + # Previously test_regrtest_c failed because test_regrtest_b.util in + # sys.modules was left after test_regrtest_a (making the import + # statement no-op), but new test_regrtest_b without the util attribute + # was imported for test_regrtest_b. 
+ testdir = os.path.join(os.path.dirname(__file__), + 'regrtestdata', 'import_from_tests') + tests = [f'test_regrtest_{name}' for name in ('a', 'b', 'c')] + args = ['-Wd', '-E', '-bb', '-m', 'test', '--testdir=%s' % testdir, *tests] + output = self.run_python(args) + self.check_executed_tests(output, tests, stats=3) + def check_add_python_opts(self, option): # --fast-ci and --slow-ci add "-u -W default -bb -E" options to Python code = textwrap.dedent(r""" diff --git a/Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst b/Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst new file mode 100644 index 00000000000000..b1a78370afedb2 --- /dev/null +++ b/Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst @@ -0,0 +1,4 @@ +Fixed order dependence in running tests in the same process +when a test that has submodules (e.g. test_importlib) follows a test that +imports its submodule (e.g. test_importlib.util) and precedes a test +(e.g. test_unittest or test_compileall) that uses that submodule. From fa7a6f23036537567592647d15f043722c7144ad Mon Sep 17 00:00:00 2001 From: Pablo Galindo Date: Mon, 4 Dec 2023 17:54:29 +0000 Subject: [PATCH 265/265] Python 3.11.7 --- Include/patchlevel.h | 4 +- Lib/pydoc_data/topics.py | 229 +++-- Misc/NEWS.d/3.11.7.rst | 795 ++++++++++++++++++ ...-10-05-11-46-20.gh-issue-109191.imUkVN.rst | 1 - ...-10-06-02-15-23.gh-issue-103053.--7JUF.rst | 3 - ...-09-17-21-47-31.gh-issue-109521.JDF6i9.rst | 5 - ...-11-27-09-44-16.gh-issue-112438.GdNZiI.rst | 2 - ...-09-11-12-41-42.gh-issue-109216.60QOSb.rst | 2 - ...-10-02-23-17-08.gh-issue-110237._Xub0z.rst | 1 - ...-10-11-13-46-14.gh-issue-110696.J9kSzr.rst | 2 - ...3-10-23-22-11-09.gh-issue-94438.y2pITu.rst | 1 - ...3-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst | 3 - ...-10-27-11-51-40.gh-issue-111380.vgSbir.rst | 2 - ...-10-27-12-17-49.gh-issue-111366._TSknV.rst | 3 - ...-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst | 1 - ...-10-31-14-25-21.gh-issue-109181.11h6Mc.rst | 2 - ...-11-19-15-57-23.gh-issue-112266.BSJMbR.rst | 2 - ...-11-25-22-39-44.gh-issue-112387.AbBq5W.rst | 2 - ...-11-25-22-58-49.gh-issue-112388.MU3cIM.rst | 2 - ...-12-03-19-34-51.gh-issue-112625.QWTlwS.rst | 1 - ...-09-03-13-43-49.gh-issue-108826.KG7abS.rst | 1 - .../2019-01-07-06-18-25.bpo-35668.JimxP5.rst | 4 - .../2018-11-08-18-44-04.bpo-35191.s29bWu.rst | 2 - .../2020-05-21-23-32-46.bpo-40262.z4fQv1.rst | 2 - .../2020-07-28-20-48-05.bpo-41422.iMwnMu.rst | 2 - ...2-05-06-15-49-57.gh-issue-86826.rf006W.rst | 4 - ...2-05-28-20-55-07.gh-issue-73561.YRmAvy.rst | 1 - ...-09-02-16-07-23.gh-issue-108791.fBcAqh.rst | 1 - ...-09-23-14-40-51.gh-issue-109786.UX3pKv.rst | 2 - ...-09-25-20-05-41.gh-issue-109747._cRJH8.rst | 3 - ...-10-02-05-23-27.gh-issue-110196.djwt0z.rst | 1 - ...-10-04-18-56-29.gh-issue-110365.LCxiau.rst | 2 - ...-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst | 3 - ...-10-08-18-15-02.gh-issue-110519.RDGe8-.rst | 3 - ...3-10-09-19-09-32.gh-issue-65052.C2mRlo.rst | 1 - ...-10-10-10-46-55.gh-issue-110590.fatz-h.rst | 3 - ...-10-19-22-46-34.gh-issue-111092.hgut12.rst | 1 - ...-10-20-15-29-10.gh-issue-110910.u2oPwX.rst | 3 - ...-10-21-13-57-06.gh-issue-111159.GoHp7s.rst | 1 - ...-10-22-21-28-05.gh-issue-111187._W11Ab.rst | 1 - ...-10-23-13-53-58.gh-issue-111174.Oohmzd.rst | 2 - ...-10-24-12-09-46.gh-issue-111251.urFYtn.rst | 1 - ...3-10-27-12-46-56.gh-issue-68166.0EbWW4.rst | 4 - ...-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst | 1 - ...-10-31-07-46-56.gh-issue-111531.6zUV_G.rst | 2 - 
...-11-01-14-03-24.gh-issue-110894.7-wZxC.rst | 1 - ...-11-04-10-24-25.gh-issue-111541.x0RBI1.rst | 1 - ...-11-08-11-50-49.gh-issue-111841.iSqdQf.rst | 2 - ...-11-08-15-58-57.gh-issue-111804.uAXTOL.rst | 2 - ...-11-10-22-08-28.gh-issue-111942.MDFm6v.rst | 2 - ...-11-11-16-42-48.gh-issue-109538.cMG5ux.rst | 1 - ...-11-14-18-43-55.gh-issue-111942.x1pnrj.rst | 2 - ...-11-15-04-53-37.gh-issue-112105.I3RcVN.rst | 1 - ...3-11-24-21-00-24.gh-issue-94722.GMIQIn.rst | 2 - ...-11-28-20-01-33.gh-issue-112509.QtoKed.rst | 3 - ...-12-02-12-55-17.gh-issue-112618.7_FT8-.rst | 2 - ...-09-05-20-46-35.gh-issue-108927.TpwWav.rst | 4 - ...-09-13-05-58-09.gh-issue-104736.lA25Fu.rst | 4 - ...-09-28-12-25-19.gh-issue-109972.GYnwIP.rst | 2 - ...-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst | 3 - ...-10-03-10-54-09.gh-issue-110267.O-c47G.rst | 2 - ...3-10-05-13-46-50.gh-issue-81002.bOcuV6.rst | 1 - ...-10-05-14-22-48.gh-issue-110388.1-HQJO.rst | 1 - ...-10-05-19-33-49.gh-issue-110167.mIdj3v.rst | 5 - ...-10-06-02-32-18.gh-issue-103053.VfxBLI.rst | 3 - ...-10-10-23-20-13.gh-issue-110647.jKG3sY.rst | 2 - ...-10-16-13-47-24.gh-issue-110918.aFgZK3.rst | 4 - ...-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst | 2 - ...-10-21-00-10-36.gh-issue-110932.jktjJU.rst | 2 - ...-10-21-19-27-36.gh-issue-111165.FU6mUk.rst | 2 - ...-10-25-13-13-30.gh-issue-111309.Re7orL.rst | 1 - ...-10-31-22-09-25.gh-issue-110367.UhQi44.rst | 3 - ...-10-05-15-23-23.gh-issue-109286.N8OzMg.rst | 1 - ...-10-06-14-20-14.gh-issue-110437.xpYy9q.rst | 2 - ...-10-19-21-46-18.gh-issue-110913.CWlPfg.rst | 1 - ...3-05-21-23-54-52.gh-issue-99834.6ANPts.rst | 1 - ...3-08-30-16-33-57.gh-issue-92603.ATkKVO.rst | 3 - ...3-09-02-08-49-57.gh-issue-71383.Ttkchg.rst | 2 - ...-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst | 1 - ...-10-18-01-40-36.gh-issue-111015.NaLI2L.rst | 1 - ...-10-18-17-26-36.gh-issue-110950.sonoma.rst | 3 - README.rst | 2 +- 82 files changed, 910 insertions(+), 281 deletions(-) create mode 100644 Misc/NEWS.d/3.11.7.rst delete mode 100644 Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst delete mode 100644 Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst delete mode 100644 Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst delete mode 100644 Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst 
delete mode 100644 Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst delete mode 100644 Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst delete mode 100644 Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst delete mode 100644 Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst delete mode 100644 Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst delete mode 100644 Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst delete mode 100644 Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst delete mode 100644 Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst delete mode 100644 Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst delete mode 100644 
Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst delete mode 100644 Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst delete mode 100644 Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst delete mode 100644 Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst delete mode 100644 Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst delete mode 100644 Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst delete mode 100644 Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst delete mode 100644 Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst delete mode 100644 Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst delete mode 100644 Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst delete mode 100644 Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst diff --git a/Include/patchlevel.h b/Include/patchlevel.h index 58a3297d17b872..7acea9370345ce 100644 --- a/Include/patchlevel.h +++ b/Include/patchlevel.h @@ -18,12 +18,12 @@ /*--start constants--*/ #define PY_MAJOR_VERSION 3 #define PY_MINOR_VERSION 11 -#define PY_MICRO_VERSION 6 +#define PY_MICRO_VERSION 7 #define PY_RELEASE_LEVEL PY_RELEASE_LEVEL_FINAL #define PY_RELEASE_SERIAL 0 /* Version as a string */ -#define PY_VERSION "3.11.6+" +#define PY_VERSION "3.11.7" /*--end constants--*/ /* Version as a single 4-byte hex number, e.g. 0x010502B2 == 1.5.2b2. diff --git a/Lib/pydoc_data/topics.py b/Lib/pydoc_data/topics.py index 06422d21c6b944..9cdf477d44728e 100644 --- a/Lib/pydoc_data/topics.py +++ b/Lib/pydoc_data/topics.py @@ -1,5 +1,5 @@ # -*- coding: utf-8 -*- -# Autogenerated by Sphinx on Mon Oct 2 14:27:48 2023 +# Autogenerated by Sphinx on Mon Dec 4 17:55:07 2023 # as part of the release process. 
topics = {'assert': 'The "assert" statement\n' '**********************\n' @@ -552,30 +552,30 @@ '[2] In pattern matching, a sequence is defined as one of the\n' ' following:\n' '\n' - ' * a class that inherits from "collections.abc.Sequence"\n' + ' * a class that inherits from "collections.abc.Sequence"\n' '\n' - ' * a Python class that has been registered as\n' - ' "collections.abc.Sequence"\n' + ' * a Python class that has been registered as\n' + ' "collections.abc.Sequence"\n' '\n' - ' * a builtin class that has its (CPython) ' - '"Py_TPFLAGS_SEQUENCE"\n' - ' bit set\n' + ' * a builtin class that has its (CPython) "Py_TPFLAGS_SEQUENCE" ' + 'bit\n' + ' set\n' '\n' - ' * a class that inherits from any of the above\n' + ' * a class that inherits from any of the above\n' '\n' ' The following standard library classes are sequences:\n' '\n' - ' * "array.array"\n' + ' * "array.array"\n' '\n' - ' * "collections.deque"\n' + ' * "collections.deque"\n' '\n' - ' * "list"\n' + ' * "list"\n' '\n' - ' * "memoryview"\n' + ' * "memoryview"\n' '\n' - ' * "range"\n' + ' * "range"\n' '\n' - ' * "tuple"\n' + ' * "tuple"\n' '\n' ' Note:\n' '\n' @@ -586,16 +586,16 @@ '[3] In pattern matching, a mapping is defined as one of the ' 'following:\n' '\n' - ' * a class that inherits from "collections.abc.Mapping"\n' + ' * a class that inherits from "collections.abc.Mapping"\n' '\n' - ' * a Python class that has been registered as\n' - ' "collections.abc.Mapping"\n' + ' * a Python class that has been registered as\n' + ' "collections.abc.Mapping"\n' '\n' - ' * a builtin class that has its (CPython) ' - '"Py_TPFLAGS_MAPPING"\n' - ' bit set\n' + ' * a builtin class that has its (CPython) "Py_TPFLAGS_MAPPING" ' + 'bit\n' + ' set\n' '\n' - ' * a class that inherits from any of the above\n' + ' * a class that inherits from any of the above\n' '\n' ' The standard library classes "dict" and ' '"types.MappingProxyType"\n' @@ -1181,16 +1181,23 @@ 'attribute references, which most objects do. This ' 'object is then\n' 'asked to produce the attribute whose name is the ' - 'identifier. This\n' - 'production can be customized by overriding the ' - '"__getattr__()" method.\n' - 'If this attribute is not available, the exception ' - '"AttributeError" is\n' - 'raised. Otherwise, the type and value of the object ' - 'produced is\n' - 'determined by the object. Multiple evaluations of ' - 'the same attribute\n' - 'reference may yield different objects.\n', + 'identifier. The type\n' + 'and value produced is determined by the object. ' + 'Multiple evaluations\n' + 'of the same attribute reference may yield different ' + 'objects.\n' + '\n' + 'This production can be customized by overriding the\n' + '"__getattribute__()" method or the "__getattr__()" ' + 'method. 
The\n' + '"__getattribute__()" method is called first and ' + 'either returns a value\n' + 'or raises "AttributeError" if the attribute is not ' + 'available.\n' + '\n' + 'If an "AttributeError" is raised and the object has ' + 'a "__getattr__()"\n' + 'method, that method is called as a fallback.\n', 'augassign': 'Augmented assignment statements\n' '*******************************\n' '\n' @@ -2872,18 +2879,19 @@ ' bindings made during a successful pattern match outlive the\n' ' executed block and can be used after the match statement**.\n' '\n' - ' Note:\n' + ' Note:\n' '\n' - ' During failed pattern matches, some subpatterns may ' - 'succeed.\n' - ' Do not rely on bindings being made for a failed match.\n' - ' Conversely, do not rely on variables remaining unchanged ' - 'after\n' - ' a failed match. The exact behavior is dependent on\n' - ' implementation and may vary. This is an intentional ' - 'decision\n' - ' made to allow different implementations to add ' - 'optimizations.\n' + ' During failed pattern matches, some subpatterns may ' + 'succeed. Do\n' + ' not rely on bindings being made for a failed match. ' + 'Conversely,\n' + ' do not rely on variables remaining unchanged after a ' + 'failed\n' + ' match. The exact behavior is dependent on implementation ' + 'and may\n' + ' vary. This is an intentional decision made to allow ' + 'different\n' + ' implementations to add optimizations.\n' '\n' '3. If the pattern succeeds, the corresponding guard (if present) ' 'is\n' @@ -3535,9 +3543,10 @@ '* convert "P1" to a keyword pattern using "CLS.__match_args__"\n' '\n' '* For each keyword argument "attr=P2":\n' - ' * "hasattr(, "attr")"\n' '\n' - ' * "P2" matches ".attr"\n' + ' * "hasattr(, "attr")"\n' + '\n' + ' * "P2" matches ".attr"\n' '\n' '* … and so on for the corresponding keyword argument/pattern ' 'pair.\n' @@ -3998,30 +4007,30 @@ '[2] In pattern matching, a sequence is defined as one of the\n' ' following:\n' '\n' - ' * a class that inherits from "collections.abc.Sequence"\n' + ' * a class that inherits from "collections.abc.Sequence"\n' '\n' - ' * a Python class that has been registered as\n' - ' "collections.abc.Sequence"\n' + ' * a Python class that has been registered as\n' + ' "collections.abc.Sequence"\n' '\n' - ' * a builtin class that has its (CPython) ' - '"Py_TPFLAGS_SEQUENCE"\n' - ' bit set\n' + ' * a builtin class that has its (CPython) ' + '"Py_TPFLAGS_SEQUENCE" bit\n' + ' set\n' '\n' - ' * a class that inherits from any of the above\n' + ' * a class that inherits from any of the above\n' '\n' ' The following standard library classes are sequences:\n' '\n' - ' * "array.array"\n' + ' * "array.array"\n' '\n' - ' * "collections.deque"\n' + ' * "collections.deque"\n' '\n' - ' * "list"\n' + ' * "list"\n' '\n' - ' * "memoryview"\n' + ' * "memoryview"\n' '\n' - ' * "range"\n' + ' * "range"\n' '\n' - ' * "tuple"\n' + ' * "tuple"\n' '\n' ' Note:\n' '\n' @@ -4032,16 +4041,16 @@ '[3] In pattern matching, a mapping is defined as one of the ' 'following:\n' '\n' - ' * a class that inherits from "collections.abc.Mapping"\n' + ' * a class that inherits from "collections.abc.Mapping"\n' '\n' - ' * a Python class that has been registered as\n' - ' "collections.abc.Mapping"\n' + ' * a Python class that has been registered as\n' + ' "collections.abc.Mapping"\n' '\n' - ' * a builtin class that has its (CPython) ' - '"Py_TPFLAGS_MAPPING"\n' - ' bit set\n' + ' * a builtin class that has its (CPython) ' + '"Py_TPFLAGS_MAPPING" bit\n' + ' set\n' '\n' - ' * a class that inherits from any of the 
above\n' + ' * a class that inherits from any of the above\n' '\n' ' The standard library classes "dict" and ' '"types.MappingProxyType"\n' @@ -6078,18 +6087,17 @@ '\n' 'The grammar for a replacement field is as follows:\n' '\n' - ' replacement_field ::= "{" [field_name] ["!" ' - 'conversion] [":" format_spec] "}"\n' - ' field_name ::= arg_name ("." attribute_name | ' - '"[" element_index "]")*\n' - ' arg_name ::= [identifier | digit+]\n' - ' attribute_name ::= identifier\n' - ' element_index ::= digit+ | index_string\n' - ' index_string ::= +\n' - ' conversion ::= "r" | "s" | "a"\n' - ' format_spec ::= \n' + ' replacement_field ::= "{" [field_name] ["!" conversion] ' + '[":" format_spec] "}"\n' + ' field_name ::= arg_name ("." attribute_name | "[" ' + 'element_index "]")*\n' + ' arg_name ::= [identifier | digit+]\n' + ' attribute_name ::= identifier\n' + ' element_index ::= digit+ | index_string\n' + ' index_string ::= ' + '+\n' + ' conversion ::= "r" | "s" | "a"\n' + ' format_spec ::= \n' '\n' 'In less formal terms, the replacement field can start with ' 'a\n' @@ -6275,43 +6283,37 @@ 'The meaning of the various alignment options is as ' 'follows:\n' '\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | Option | ' + '| Option | ' 'Meaning ' '|\n' - ' ' '|===========|============================================================|\n' - ' | "\'<\'" | Forces the field to be left-aligned ' - 'within the available |\n' - ' | | space (this is the default for most ' + '| "\'<\'" | Forces the field to be left-aligned within ' + 'the available |\n' + '| | space (this is the default for most ' 'objects). |\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | "\'>\'" | Forces the field to be right-aligned ' - 'within the available |\n' - ' | | space (this is the default for ' + '| "\'>\'" | Forces the field to be right-aligned within ' + 'the available |\n' + '| | space (this is the default for ' 'numbers). |\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | "\'=\'" | Forces the padding to be placed after ' - 'the sign (if any) |\n' - ' | | but before the digits. This is used for ' + '| "\'=\'" | Forces the padding to be placed after the ' + 'sign (if any) |\n' + '| | but before the digits. This is used for ' 'printing fields |\n' - ' | | in the form ‘+000000120’. This alignment ' + '| | in the form ‘+000000120’. This alignment ' 'option is only |\n' - ' | | valid for numeric types. It becomes the ' + '| | valid for numeric types. It becomes the ' 'default for |\n' - ' | | numbers when ‘0’ immediately precedes the ' + '| | numbers when ‘0’ immediately precedes the ' 'field width. |\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | "\'^\'" | Forces the field to be centered within ' - 'the available |\n' - ' | | ' + '| "\'^\'" | Forces the field to be centered within the ' + 'available |\n' + '| | ' 'space. 
' '|\n' - ' ' '+-----------+------------------------------------------------------------+\n' '\n' 'Note that unless a minimum field width is defined, the ' @@ -6324,30 +6326,25 @@ 'be one of\n' 'the following:\n' '\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | Option | ' + '| Option | ' 'Meaning ' '|\n' - ' ' '|===========|============================================================|\n' - ' | "\'+\'" | indicates that a sign should be used for ' + '| "\'+\'" | indicates that a sign should be used for ' 'both positive as |\n' - ' | | well as negative ' + '| | well as negative ' 'numbers. |\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | "\'-\'" | indicates that a sign should be used ' - 'only for negative |\n' - ' | | numbers (this is the default ' + '| "\'-\'" | indicates that a sign should be used only ' + 'for negative |\n' + '| | numbers (this is the default ' 'behavior). |\n' - ' ' '+-----------+------------------------------------------------------------+\n' - ' | space | indicates that a leading space should be ' - 'used on positive |\n' - ' | | numbers, and a minus sign on negative ' + '| space | indicates that a leading space should be used ' + 'on positive |\n' + '| | numbers, and a minus sign on negative ' 'numbers. |\n' - ' ' '+-----------+------------------------------------------------------------+\n' '\n' 'The "\'z\'" option coerces negative zero floating-point ' @@ -12489,11 +12486,9 @@ 'bytes\n' 'literals.\n' '\n' - ' Changed in version 3.6: Unrecognized escape sequences produce ' - 'a\n' - ' "DeprecationWarning". In a future Python version they will be ' - 'a\n' - ' "SyntaxWarning" and eventually a "SyntaxError".\n' + 'Changed in version 3.6: Unrecognized escape sequences produce a\n' + '"DeprecationWarning". In a future Python version they will be a\n' + '"SyntaxWarning" and eventually a "SyntaxError".\n' '\n' 'Even in a raw literal, quotes can be escaped with a backslash, ' 'but the\n' diff --git a/Misc/NEWS.d/3.11.7.rst b/Misc/NEWS.d/3.11.7.rst new file mode 100644 index 00000000000000..334fbac380d5e4 --- /dev/null +++ b/Misc/NEWS.d/3.11.7.rst @@ -0,0 +1,795 @@ +.. date: 2023-12-03-19-34-51 +.. gh-issue: 112625 +.. nonce: QWTlwS +.. release date: 2023-12-04 +.. section: Core and Builtins + +Fixes a bug where a bytearray object could be cleared while iterating over +an argument in the ``bytearray.join()`` method that could result in reading +memory after it was freed. + +.. + +.. date: 2023-11-25-22-58-49 +.. gh-issue: 112388 +.. nonce: MU3cIM +.. section: Core and Builtins + +Fix an error that was causing the parser to try to overwrite tokenizer +errors. Patch by pablo Galindo + +.. + +.. date: 2023-11-25-22-39-44 +.. gh-issue: 112387 +.. nonce: AbBq5W +.. section: Core and Builtins + +Fix error positions for decoded strings with backwards tokenize errors. +Patch by Pablo Galindo + +.. + +.. date: 2023-11-19-15-57-23 +.. gh-issue: 112266 +.. nonce: BSJMbR +.. section: Core and Builtins + +Change docstrings of :attr:`~object.__dict__` and +:attr:`~object.__weakref__`. + +.. + +.. date: 2023-10-31-14-25-21 +.. gh-issue: 109181 +.. nonce: 11h6Mc +.. section: Core and Builtins + +Speed up :obj:`Traceback` object creation by lazily compute the line number. +Patch by Pablo Galindo + +.. + +.. date: 2023-10-27-19-38-33 +.. gh-issue: 102388 +.. nonce: vd5YUZ +.. section: Core and Builtins + +Fix a bug where ``iso2022_jp_3`` and ``iso2022_jp_2004`` codecs read out of +bounds + +.. + +.. 
date: 2023-10-27-12-17-49 +.. gh-issue: 111366 +.. nonce: _TSknV +.. section: Core and Builtins + +Fix an issue in the :mod:`codeop` module that was causing :exc:`SyntaxError` +exceptions raised in the presence of invalid syntax to not contain precise +error messages. Patch by Pablo Galindo + +.. + +.. date: 2023-10-27-11-51-40 +.. gh-issue: 111380 +.. nonce: vgSbir +.. section: Core and Builtins + +Fix a bug that was causing :exc:`SyntaxWarning` to appear twice when parsing +if invalid syntax is encountered later. Patch by Pablo Galindo + +.. + +.. date: 2023-10-26-15-34-11 +.. gh-issue: 88116 +.. nonce: W9-vaQ +.. section: Core and Builtins + +Traceback location ranges involving wide Unicode characters (like emoji and +Asian characters) are now properly highlighted. Patch by Batuhan Taskaya and +Pablo Galindo. + +.. + +.. date: 2023-10-23-22-11-09 +.. gh-issue: 94438 +.. nonce: y2pITu +.. section: Core and Builtins + +Fix a regression that prevented jumping across ``is None`` and ``is not +None`` when debugging. Patch by Savannah Ostrowski. + +.. + +.. date: 2023-10-11-13-46-14 +.. gh-issue: 110696 +.. nonce: J9kSzr +.. section: Core and Builtins + +Fix incorrect error message for invalid argument unpacking. Patch by Pablo +Galindo + +.. + +.. date: 2023-10-02-23-17-08 +.. gh-issue: 110237 +.. nonce: _Xub0z +.. section: Core and Builtins + +Fix missing error checks for calls to ``PyList_Append`` in +``_PyEval_MatchClass``. + +.. + +.. date: 2023-09-11-12-41-42 +.. gh-issue: 109216 +.. nonce: 60QOSb +.. section: Core and Builtins + +Fix possible memory leak in :opcode:`BUILD_MAP`. + +.. + +.. date: 2023-12-02-12-55-17 +.. gh-issue: 112618 +.. nonce: 7_FT8- +.. section: Library + +Fix a caching bug relating to :data:`typing.Annotated`. ``Annotated[str, +True]`` is no longer identical to ``Annotated[str, 1]``. + +.. + +.. date: 2023-11-28-20-01-33 +.. gh-issue: 112509 +.. nonce: QtoKed +.. section: Library + +Fix edge cases that could cause a key to be present in both the +``__required_keys__`` and ``__optional_keys__`` attributes of a +:class:`typing.TypedDict`. Patch by Jelle Zijlstra. + +.. + +.. date: 2023-11-24-21-00-24 +.. gh-issue: 94722 +.. nonce: GMIQIn +.. section: Library + +Fix a bug where comparison between instances of :class:`~doctest.DocTest` +fails if one of them has ``None`` as its lineno. + +.. + +.. date: 2023-11-15-04-53-37 +.. gh-issue: 112105 +.. nonce: I3RcVN +.. section: Library + +Make :func:`readline.set_completer_delims` work with libedit. + +.. + +.. date: 2023-11-14-18-43-55 +.. gh-issue: 111942 +.. nonce: x1pnrj +.. section: Library + +Fix SystemError in the TextIOWrapper constructor with a non-encodable "errors" +argument in non-debug mode. + +.. + +.. date: 2023-11-11-16-42-48 +.. gh-issue: 109538 +.. nonce: cMG5ux +.. section: Library + +Issue a warning message instead of having :class:`RuntimeError` be displayed +when the event loop has already been closed at :meth:`StreamWriter.__del__`. + +.. + +.. date: 2023-11-10-22-08-28 +.. gh-issue: 111942 +.. nonce: MDFm6v +.. section: Library + +Fix crashes in :meth:`io.TextIOWrapper.reconfigure` when passed invalid +arguments, e.g. a non-string encoding. + +.. + +.. date: 2023-11-08-15-58-57 +.. gh-issue: 111804 +.. nonce: uAXTOL +.. section: Library + +Remove posix.fallocate() under WASI as the underlying posix_fallocate() is +not available in WASI preview2. + +.. + +.. date: 2023-11-08-11-50-49 +.. gh-issue: 111841 +.. nonce: iSqdQf +..
section: Library + +Fix truncating arguments on an embedded null character in :meth:`os.putenv` +and :meth:`os.unsetenv` on Windows. + +.. + +.. date: 2023-11-04-10-24-25 +.. gh-issue: 111541 +.. nonce: x0RBI1 +.. section: Library + +Fix :mod:`doctest` for non-builtin :exc:`SyntaxError` subclasses. + +.. + +.. date: 2023-11-01-14-03-24 +.. gh-issue: 110894 +.. nonce: 7-wZxC +.. section: Library + +Call loop exception handler for exceptions in ``client_connected_cb`` of +:func:`asyncio.start_server` so that applications can handle them. Patch by +Kumar Aditya. + +.. + +.. date: 2023-10-31-07-46-56 +.. gh-issue: 111531 +.. nonce: 6zUV_G +.. section: Library + +Fix reference leaks in ``bind_class()`` and ``bind_all()`` methods of +:mod:`tkinter` widgets. + +.. + +.. date: 2023-10-30-08-50-46 +.. gh-issue: 111356 +.. nonce: Bc8LvA +.. section: Library + +Added :func:`io.text_encoding()`, :data:`io.DEFAULT_BUFFER_SIZE`, and +:class:`io.IncrementalNewlineDecoder` to ``io.__all__``. + +.. + +.. date: 2023-10-27-12-46-56 +.. gh-issue: 68166 +.. nonce: 0EbWW4 +.. section: Library + +Remove mention of the unsupported "vsapi" element type in +:meth:`tkinter.ttk.Style.element_create`. Add tests for ``element_create()`` +and other ``ttk.Style`` methods. Add examples for ``element_create()`` in +the documentation. + +.. + +.. date: 2023-10-24-12-09-46 +.. gh-issue: 111251 +.. nonce: urFYtn +.. section: Library + +Fix :mod:`_blake2` not checking for errors when initializing. + +.. + +.. date: 2023-10-23-13-53-58 +.. gh-issue: 111174 +.. nonce: Oohmzd +.. section: Library + +Fix crash in :meth:`io.BytesIO.getbuffer` called repeatedly for empty +BytesIO. + +.. + +.. date: 2023-10-22-21-28-05 +.. gh-issue: 111187 +.. nonce: _W11Ab +.. section: Library + +Postpone removal version for locale.getdefaultlocale() to Python 3.15. + +.. + +.. date: 2023-10-21-13-57-06 +.. gh-issue: 111159 +.. nonce: GoHp7s +.. section: Library + +Fix :mod:`doctest` output comparison for exceptions with notes. + +.. + +.. date: 2023-10-20-15-29-10 +.. gh-issue: 110910 +.. nonce: u2oPwX +.. section: Library + +Fix invalid state handling in :class:`asyncio.TaskGroup` and +:class:`asyncio.Timeout`. They now raise a proper RuntimeError if they are +improperly used and are left in a consistent state afterwards. + +.. + +.. date: 2023-10-19-22-46-34 +.. gh-issue: 111092 +.. nonce: hgut12 +.. section: Library + +Make turtledemo run without default root enabled. + +.. + +.. date: 2023-10-10-10-46-55 +.. gh-issue: 110590 +.. nonce: fatz-h +.. section: Library + +Fix a bug in :meth:`!_sre.compile` where :exc:`TypeError` would be +overwritten by :exc:`OverflowError` when the *code* argument was a list of +non-ints. + +.. + +.. date: 2023-10-09-19-09-32 +.. gh-issue: 65052 +.. nonce: C2mRlo +.. section: Library + +Prevent :mod:`pdb` from crashing when trying to display undisplayable +objects. + +.. + +.. date: 2023-10-08-18-15-02 +.. gh-issue: 110519 +.. nonce: RDGe8- +.. section: Library + +Deprecation warning about non-integer number in :mod:`gettext` now always +refers to the line in the user code where the gettext function or method is +used. Previously it could refer to a line in ``gettext`` code. + +.. + +.. date: 2023-10-07-13-50-12 +.. gh-issue: 110378 +.. nonce: Y4L8fl +.. section: Library + +:func:`~contextlib.contextmanager` and +:func:`~contextlib.asynccontextmanager` context managers now close an +invalid underlying generator object that yields more than one value. + +.. + +.. date: 2023-10-04-18-56-29 +.. gh-issue: 110365 +..
nonce: LCxiau +.. section: Library + +Fix :func:`termios.tcsetattr` bug that was overwriting existing errors +while parsing integers from the ``term`` list. + +.. + +.. date: 2023-10-02-05-23-27 +.. gh-issue: 110196 +.. nonce: djwt0z +.. section: Library + +Add a ``__reduce__`` method to :class:`IPv6Address` in order to keep +``scope_id``. + +.. + +.. date: 2023-09-25-20-05-41 +.. gh-issue: 109747 +.. nonce: _cRJH8 +.. section: Library + +Improve errors for unsupported look-behind patterns. Now re.error is raised +instead of OverflowError or RuntimeError when the width of a look-behind +pattern is too large. + +.. + +.. date: 2023-09-23-14-40-51 +.. gh-issue: 109786 +.. nonce: UX3pKv +.. section: Library + +Fix possible reference leaks and crash when re-entering the ``__next__()`` +method of :class:`itertools.pairwise`. + +.. + +.. date: 2023-09-02-16-07-23 +.. gh-issue: 108791 +.. nonce: fBcAqh +.. section: Library + +Improved error handling in the :mod:`pdb` command line interface, making it +produce more concise error messages. + +.. + +.. date: 2022-05-28-20-55-07 +.. gh-issue: 73561 +.. nonce: YRmAvy +.. section: Library + +Omit the interface scope from an IPv6 address when used as Host header by +:mod:`http.client`. + +.. + +.. date: 2022-05-06-15-49-57 +.. gh-issue: 86826 +.. nonce: rf006W +.. section: Library + +:mod:`zoneinfo` now supports the full range of values in the TZ string +determined by RFC 8536 and detects all invalid formats. Both Python and C +implementations now raise exceptions of the same type on invalid data. + +.. + +.. bpo: 41422 +.. date: 2020-07-28-20-48-05 +.. nonce: iMwnMu +.. section: Library + +Fixed memory leaks of :class:`pickle.Pickler` and :class:`pickle.Unpickler` +involving cyclic references via the internal memo mapping. + +.. + +.. bpo: 40262 +.. date: 2020-05-21-23-32-46 +.. nonce: z4fQv1 +.. section: Library + +The :meth:`ssl.SSLSocket.recv_into` method no longer requires the *buffer* +argument to implement ``__len__`` and supports buffers with arbitrary item +size. + +.. + +.. bpo: 35191 +.. date: 2018-11-08-18-44-04 +.. nonce: s29bWu +.. section: Library + +Fix unexpected integer truncation in :meth:`socket.setblocking` which caused +it to interpret multiples of ``2**32`` as ``False``. + +.. + +.. date: 2023-09-03-13-43-49 +.. gh-issue: 108826 +.. nonce: KG7abS +.. section: Documentation + +The :mod:`dis` module command-line interface is now mentioned in the documentation. + +.. + +.. date: 2023-10-31-22-09-25 +.. gh-issue: 110367 +.. nonce: UhQi44 +.. section: Tests + +Make regrtest ``--verbose3`` option compatible with ``--huntrleaks -jN`` +options. The ``./python -m test -j1 -R 3:3 --verbose3`` command now works as +expected. Patch by Victor Stinner. + +.. + +.. date: 2023-10-25-13-13-30 +.. gh-issue: 111309 +.. nonce: Re7orL +.. section: Tests + +:mod:`distutils` tests can now be run via :mod:`unittest`. + +.. + +.. date: 2023-10-21-19-27-36 +.. gh-issue: 111165 +.. nonce: FU6mUk +.. section: Tests + +Remove no longer used functions ``run_unittest()`` and ``run_doctest()`` and +class ``BasicTestRunner`` from the :mod:`test.support` module. + +.. + +.. date: 2023-10-21-00-10-36 +.. gh-issue: 110932 +.. nonce: jktjJU +.. section: Tests + +Fix regrtest if the ``SOURCE_DATE_EPOCH`` environment variable is defined: +use the variable value as the random seed. Patch by Victor Stinner. + +.. + +.. date: 2023-10-17-17-54-36 +.. gh-issue: 110995 +.. nonce: Fx8KRD +.. section: Tests + +test_gdb: Fix detection of gdb built without Python scripting support. Patch +by Victor Stinner.
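A minimal sketch of the :class:`IPv6Address` ``__reduce__`` entry above (the address and zone name are arbitrary examples; behaviour assumed for 3.11.7+ and Python 3.9+ ``scope_id`` support)::

    import ipaddress
    import pickle

    addr = ipaddress.IPv6Address("fe80::1%eth0")      # link-local with a zone
    restored = pickle.loads(pickle.dumps(addr))

    # With the __reduce__ fix the zone index survives the round trip.
    assert restored == addr
    assert restored.scope_id == "eth0"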
+ +.. + +.. date: 2023-10-16-13-47-24 +.. gh-issue: 110918 +.. nonce: aFgZK3 +.. section: Tests + +Test case matching patterns specified by options ``--match``, ``--ignore``, +``--matchfile`` and ``--ignorefile`` are now tested in the order of +specification, and the last match determines whether the test case be run or +ignored. + +.. + +.. date: 2023-10-10-23-20-13 +.. gh-issue: 110647 +.. nonce: jKG3sY +.. section: Tests + +Fix test_stress_modifying_handlers() of test_signal. Patch by Victor +Stinner. + +.. + +.. date: 2023-10-06-02-32-18 +.. gh-issue: 103053 +.. nonce: VfxBLI +.. section: Tests + +Fix test_tools.test_freeze on FreeBSD: run "make distclean" instead of "make +clean" in the copied source directory to remove also the "python" program. +Patch by Victor Stinner. + +.. + +.. date: 2023-10-05-19-33-49 +.. gh-issue: 110167 +.. nonce: mIdj3v +.. section: Tests + +Fix a deadlock in test_socket when server fails with a timeout but the +client is still running in its thread. Don't hold a lock to call cleanup +functions in doCleanups(). One of the cleanup function waits until the +client completes, whereas the client could deadlock if it called +addCleanup() in such situation. Patch by Victor Stinner. + +.. + +.. date: 2023-10-05-14-22-48 +.. gh-issue: 110388 +.. nonce: 1-HQJO +.. section: Tests + +Add tests for :mod:`tty`. + +.. + +.. date: 2023-10-05-13-46-50 +.. gh-issue: 81002 +.. nonce: bOcuV6 +.. section: Tests + +Add tests for :mod:`termios`. + +.. + +.. date: 2023-10-03-10-54-09 +.. gh-issue: 110267 +.. nonce: O-c47G +.. section: Tests + +Add tests for pickling and copying PyStructSequence objects. Patched by +Xuehai Pan. + +.. + +.. date: 2023-09-29-00-19-21 +.. gh-issue: 109974 +.. nonce: Sh_g-r +.. section: Tests + +Fix race conditions in test_threading lock tests. Wait until a condition is +met rather than using :func:`time.sleep` with a hardcoded number of seconds. +Patch by Victor Stinner. + +.. + +.. date: 2023-09-28-12-25-19 +.. gh-issue: 109972 +.. nonce: GYnwIP +.. section: Tests + +Split test_gdb.py file into a test_gdb package made of multiple tests, so +tests can now be run in parallel. Patch by Victor Stinner. + +.. + +.. date: 2023-09-13-05-58-09 +.. gh-issue: 104736 +.. nonce: lA25Fu +.. section: Tests + +Fix test_gdb on Python built with LLVM clang 16 on Linux ppc64le (ex: Fedora +38). Search patterns in gdb "bt" command output to detect when gdb fails to +retrieve the traceback. For example, skip a test if ``Backtrace stopped: +frame did not save the PC`` is found. Patch by Victor Stinner. + +.. + +.. date: 2023-09-05-20-46-35 +.. gh-issue: 108927 +.. nonce: TpwWav +.. section: Tests + +Fixed order dependence in running tests in the same process when a test that +has submodules (e.g. test_importlib) follows a test that imports its +submodule (e.g. test_importlib.util) and precedes a test (e.g. test_unittest +or test_compileall) that uses that submodule. + +.. + +.. date: 2023-10-06-02-15-23 +.. gh-issue: 103053 +.. nonce: --7JUF +.. section: Build + +"make check-clean-src" now also checks if the "python" program is found in +the source directory: fail with an error if it does exist. Patch by Victor +Stinner. + +.. + +.. date: 2023-10-05-11-46-20 +.. gh-issue: 109191 +.. nonce: imUkVN +.. section: Build + +Fix compile error when building with recent versions of libedit. + +.. + +.. date: 2023-10-19-21-46-18 +.. gh-issue: 110913 +.. nonce: CWlPfg +.. section: Windows + +WindowsConsoleIO now correctly chunks large buffers without splitting up +UTF-8 sequences. 
+ +.. + +.. date: 2023-10-06-14-20-14 +.. gh-issue: 110437 +.. nonce: xpYy9q +.. section: Windows + +Allows overriding the source of VC redistributables so that releases can be +guaranteed to never downgrade between updates. + +.. + +.. date: 2023-10-05-15-23-23 +.. gh-issue: 109286 +.. nonce: N8OzMg +.. section: Windows + +Update Windows installer to use SQLite 3.43.1. + +.. + +.. date: 2023-10-18-17-26-36 +.. gh-issue: 110950 +.. nonce: sonoma +.. section: macOS + +Update macOS installer to include an upstream Tcl/Tk fix for the ``Secure +coding is not enabled for restorable state!`` warning encountered in Tkinter +on macOS 14 Sonoma. + +.. + +.. date: 2023-10-18-01-40-36 +.. gh-issue: 111015 +.. nonce: NaLI2L +.. section: macOS + +Ensure that IDLE.app and Python Launcher.app are installed with appropriate +permissions on macOS builds. + +.. + +.. date: 2023-10-04-23-38-24 +.. gh-issue: 109286 +.. nonce: 1ZLMaq +.. section: macOS + +Update macOS installer to use SQLite 3.43.1. + +.. + +.. date: 2023-09-02-08-49-57 +.. gh-issue: 71383 +.. nonce: Ttkchg +.. section: macOS + +Update macOS installer to include an upstream Tcl/Tk fix for the +``ttk::ThemeChanged`` error encountered in Tkinter. + +.. + +.. date: 2023-08-30-16-33-57 +.. gh-issue: 92603 +.. nonce: ATkKVO +.. section: macOS + +Update macOS installer to include a fix accepted by upstream Tcl/Tk for a +crash encountered after the first :meth:`tkinter.Tk` instance is destroyed. + +.. + +.. date: 2023-05-21-23-54-52 +.. gh-issue: 99834 +.. nonce: 6ANPts +.. section: macOS + +Update macOS installer to Tcl/Tk 8.6.13. + +.. + +.. bpo: 35668 +.. date: 2019-01-07-06-18-25 +.. nonce: JimxP5 +.. section: IDLE + +Add docstrings to the IDLE debugger module. Fix two bugs: initialize +Idb.botframe (should be in Bdb); in Idb.in_rpc_code, check whether +prev_frame is None before trying to use it. Greatly expand test_debugger. + +.. + +.. date: 2023-11-27-09-44-16 +.. gh-issue: 112438 +.. nonce: GdNZiI +.. section: C API + +Fix support of format units "es", "et", "es#", and "et#" in nested tuples in +:c:func:`PyArg_ParseTuple`-like functions. + +.. + +.. date: 2023-09-17-21-47-31 +.. gh-issue: 109521 +.. nonce: JDF6i9 +.. section: C API + +:c:func:`PyImport_GetImporter` now sets RuntimeError if it fails to get +:data:`sys.path_hooks` or :data:`sys.path_importer_cache` or they are not +list and dict correspondingly. Previously it could return NULL without +setting error in obscure cases, crash or raise SystemError if these +attributes have wrong type. diff --git a/Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst b/Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst deleted file mode 100644 index 27e5df790bc0c6..00000000000000 --- a/Misc/NEWS.d/next/Build/2023-10-05-11-46-20.gh-issue-109191.imUkVN.rst +++ /dev/null @@ -1 +0,0 @@ -Fix compile error when building with recent versions of libedit. diff --git a/Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst b/Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst deleted file mode 100644 index 81aa21357287c7..00000000000000 --- a/Misc/NEWS.d/next/Build/2023-10-06-02-15-23.gh-issue-103053.--7JUF.rst +++ /dev/null @@ -1,3 +0,0 @@ -"make check-clean-src" now also checks if the "python" program is found in -the source directory: fail with an error if it does exist. Patch by Victor -Stinner. 
diff --git a/Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst b/Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst deleted file mode 100644 index 338650c9246686..00000000000000 --- a/Misc/NEWS.d/next/C API/2023-09-17-21-47-31.gh-issue-109521.JDF6i9.rst +++ /dev/null @@ -1,5 +0,0 @@ -:c:func:`PyImport_GetImporter` now sets RuntimeError if it fails to get -:data:`sys.path_hooks` or :data:`sys.path_importer_cache` or they are not -list and dict correspondingly. Previously it could return NULL without -setting error in obscure cases, crash or raise SystemError if these -attributes have wrong type. diff --git a/Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst b/Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst deleted file mode 100644 index 113119efd6aebb..00000000000000 --- a/Misc/NEWS.d/next/C API/2023-11-27-09-44-16.gh-issue-112438.GdNZiI.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix support of format units "es", "et", "es#", and "et#" in nested tuples in -:c:func:`PyArg_ParseTuple`-like functions. diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst b/Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst deleted file mode 100644 index f36310fc5f8064..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-09-11-12-41-42.gh-issue-109216.60QOSb.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix possible memory leak in :opcode:`BUILD_MAP`. - diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst deleted file mode 100644 index 67b95c52f7e4da..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-02-23-17-08.gh-issue-110237._Xub0z.rst +++ /dev/null @@ -1 +0,0 @@ -Fix missing error checks for calls to ``PyList_Append`` in ``_PyEval_MatchClass``. diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst deleted file mode 100644 index c845289d714f4c..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-11-13-46-14.gh-issue-110696.J9kSzr.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix incorrect error message for invalid argument unpacking. Patch by Pablo -Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst deleted file mode 100644 index b6e147a48a8cd8..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-23-22-11-09.gh-issue-94438.y2pITu.rst +++ /dev/null @@ -1 +0,0 @@ -Fix a regression that prevented jumping across ``is None`` and ``is not None`` when debugging. Patch by Savannah Ostrowski. diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst deleted file mode 100644 index 12257ef2b0b9b0..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-26-15-34-11.gh-issue-88116.W9-vaQ.rst +++ /dev/null @@ -1,3 +0,0 @@ -Traceback location ranges involving wide unicode characters (like emoji and -asian characters) now are properly highlighted. Patch by Batuhan Taskaya and -Pablo Galindo. 
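The wide-character traceback entry above is easiest to see with a short script whose failing line mixes emoji/CJK and ASCII text; a minimal sketch (identifier and key values are arbitrary)::

    import traceback

    try:
        看板 = {"🙂": 1}
        看板["🙁"]          # KeyError: the subscript is what gets underlined
    except KeyError:
        # With the fix, the ^^^ markers printed under the failing line stay
        # aligned even though the line contains wide characters.
        traceback.print_exc()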
diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst deleted file mode 100644 index 4ce6398dbfe3b1..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-11-51-40.gh-issue-111380.vgSbir.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix a bug that was causing :exc:`SyntaxWarning` to appear twice when parsing -if invalid syntax is encountered later. Patch by Pablo galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst deleted file mode 100644 index 7e76ce916ea714..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-12-17-49.gh-issue-111366._TSknV.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix an issue in the :mod:`codeop` that was causing :exc:`SyntaxError` -exceptions raised in the presence of invalid syntax to not contain precise -error messages. Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst deleted file mode 100644 index 268a3d310f2b49..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-27-19-38-33.gh-issue-102388.vd5YUZ.rst +++ /dev/null @@ -1 +0,0 @@ -Fix a bug where ``iso2022_jp_3`` and ``iso2022_jp_2004`` codecs read out of bounds diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst b/Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst deleted file mode 100644 index 61a15b471cfb27..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-10-31-14-25-21.gh-issue-109181.11h6Mc.rst +++ /dev/null @@ -1,2 +0,0 @@ -Speed up :obj:`Traceback` object creation by lazily compute the line number. -Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst b/Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst deleted file mode 100644 index 18433db9bb976e..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-11-19-15-57-23.gh-issue-112266.BSJMbR.rst +++ /dev/null @@ -1,2 +0,0 @@ -Change docstrings of :attr:`~object.__dict__` and -:attr:`~object.__weakref__`. diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst b/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst deleted file mode 100644 index adac11bf4c90a1..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-39-44.gh-issue-112387.AbBq5W.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix error positions for decoded strings with backwards tokenize errors. -Patch by Pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst b/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst deleted file mode 100644 index 1c82be2febda4f..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-11-25-22-58-49.gh-issue-112388.MU3cIM.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix an error that was causing the parser to try to overwrite tokenizer -errors. 
Patch by pablo Galindo diff --git a/Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst b/Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst deleted file mode 100644 index 4970e10f3f4dcb..00000000000000 --- a/Misc/NEWS.d/next/Core and Builtins/2023-12-03-19-34-51.gh-issue-112625.QWTlwS.rst +++ /dev/null @@ -1 +0,0 @@ -Fixes a bug where a bytearray object could be cleared while iterating over an argument in the ``bytearray.join()`` method that could result in reading memory after it was freed. diff --git a/Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst b/Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst deleted file mode 100644 index 139b8f3457930c..00000000000000 --- a/Misc/NEWS.d/next/Documentation/2023-09-03-13-43-49.gh-issue-108826.KG7abS.rst +++ /dev/null @@ -1 +0,0 @@ -:mod:`dis` module command-line interface is now mentioned in documentation. diff --git a/Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst b/Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst deleted file mode 100644 index 8bb5420517d55f..00000000000000 --- a/Misc/NEWS.d/next/IDLE/2019-01-07-06-18-25.bpo-35668.JimxP5.rst +++ /dev/null @@ -1,4 +0,0 @@ -Add docstrings to the IDLE debugger module. Fix two bugs: -initialize Idb.botframe (should be in Bdb); in Idb.in_rpc_code, -check whether prev_frame is None before trying to use it. -Greatly expand test_debugger. diff --git a/Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst b/Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst deleted file mode 100644 index 37f6f56597b6d4..00000000000000 --- a/Misc/NEWS.d/next/Library/2018-11-08-18-44-04.bpo-35191.s29bWu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix unexpected integer truncation in :meth:`socket.setblocking` which caused -it to interpret multiples of ``2**32`` as ``False``. diff --git a/Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst b/Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst deleted file mode 100644 index c017a1c8df09d8..00000000000000 --- a/Misc/NEWS.d/next/Library/2020-05-21-23-32-46.bpo-40262.z4fQv1.rst +++ /dev/null @@ -1,2 +0,0 @@ -The :meth:`ssl.SSLSocket.recv_into` method no longer requires the *buffer* -argument to implement ``__len__`` and supports buffers with arbitrary item size. diff --git a/Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst b/Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst deleted file mode 100644 index 8bde68f8f2afc8..00000000000000 --- a/Misc/NEWS.d/next/Library/2020-07-28-20-48-05.bpo-41422.iMwnMu.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fixed memory leaks of :class:`pickle.Pickler` and :class:`pickle.Unpickler` involving cyclic references via the -internal memo mapping. diff --git a/Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst b/Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst deleted file mode 100644 index 02cd75eec4be9e..00000000000000 --- a/Misc/NEWS.d/next/Library/2022-05-06-15-49-57.gh-issue-86826.rf006W.rst +++ /dev/null @@ -1,4 +0,0 @@ -:mod:`zipinfo` now supports the full range of values in the TZ string -determined by RFC 8536 and detects all invalid formats. -Both Python and C implementations now raise exceptions of the same -type on invalid data. 
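The bpo-35191 :meth:`socket.setblocking` entry above can be exercised with a throwaway socket; a minimal sketch (assumes a platform where socket creation is available)::

    import socket

    with socket.socket() as sock:
        # 2**32 is truthy; with the fix it means "blocking" instead of being
        # truncated to 0 and silently switching the socket to non-blocking.
        sock.setblocking(2**32)
        assert sock.getblocking() is True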
diff --git a/Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst b/Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst deleted file mode 100644 index 5e00b7d20b8ca8..00000000000000 --- a/Misc/NEWS.d/next/Library/2022-05-28-20-55-07.gh-issue-73561.YRmAvy.rst +++ /dev/null @@ -1 +0,0 @@ -Omit the interface scope from an IPv6 address when used as Host header by :mod:`http.client`. diff --git a/Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst b/Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst deleted file mode 100644 index 84a2cd589e10d5..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-09-02-16-07-23.gh-issue-108791.fBcAqh.rst +++ /dev/null @@ -1 +0,0 @@ -Improved error handling in :mod:`pdb` command line interface, making it produce more concise error messages. diff --git a/Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst b/Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst deleted file mode 100644 index 07222fa339d703..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-09-23-14-40-51.gh-issue-109786.UX3pKv.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix possible reference leaks and crash when re-enter the ``__next__()`` method of -:class:`itertools.pairwise`. diff --git a/Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst b/Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst deleted file mode 100644 index b64ba627897a1a..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-09-25-20-05-41.gh-issue-109747._cRJH8.rst +++ /dev/null @@ -1,3 +0,0 @@ -Improve errors for unsupported look-behind patterns. Now re.error is raised -instead of OverflowError or RuntimeError for too large width of look-behind -pattern. diff --git a/Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst b/Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst deleted file mode 100644 index 341f3380fffd60..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-02-05-23-27.gh-issue-110196.djwt0z.rst +++ /dev/null @@ -1 +0,0 @@ -Add ``__reduce__`` method to :class:`IPv6Address` in order to keep ``scope_id`` diff --git a/Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst b/Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst deleted file mode 100644 index a1ac39b60296a3..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-04-18-56-29.gh-issue-110365.LCxiau.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix :func:`termios.tcsetattr` bug that was overwritting existing errors -during parsing integers from ``term`` list. diff --git a/Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst b/Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst deleted file mode 100644 index ef5395fc3c6420..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-07-13-50-12.gh-issue-110378.Y4L8fl.rst +++ /dev/null @@ -1,3 +0,0 @@ -:func:`~contextlib.contextmanager` and -:func:`~contextlib.asynccontextmanager` context managers now close an invalid -underlying generator object that yields more then one value. 
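The :func:`~contextlib.contextmanager` entry above concerns generator functions that (incorrectly) yield more than once; a minimal sketch of the situation being handled (the function name is arbitrary)::

    from contextlib import contextmanager

    @contextmanager
    def leaky():
        yield "first"
        yield "second"   # a context-manager generator must yield exactly once

    try:
        with leaky():
            pass
    except RuntimeError:
        # "generator didn't stop": with this change the offending generator
        # is also closed rather than left suspended at the second yield.
        pass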
diff --git a/Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst b/Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst deleted file mode 100644 index 8ff916736584dc..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-08-18-15-02.gh-issue-110519.RDGe8-.rst +++ /dev/null @@ -1,3 +0,0 @@ -Deprecation warning about non-integer number in :mod:`gettext` now alwais -refers to the line in the user code where gettext function or method is -used. Previously it could refer to a line in ``gettext`` code. diff --git a/Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst b/Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst deleted file mode 100644 index 4739c63bb3cc9d..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-09-19-09-32.gh-issue-65052.C2mRlo.rst +++ /dev/null @@ -1 +0,0 @@ -Prevent :mod:`pdb` from crashing when trying to display undisplayable objects diff --git a/Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst b/Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst deleted file mode 100644 index 20dc3fff205994..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-10-10-46-55.gh-issue-110590.fatz-h.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix a bug in :meth:`!_sre.compile` where :exc:`TypeError` -would be overwritten by :exc:`OverflowError` when -the *code* argument was a list of non-ints. diff --git a/Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst b/Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst deleted file mode 100644 index 487bd177d27e31..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-19-22-46-34.gh-issue-111092.hgut12.rst +++ /dev/null @@ -1 +0,0 @@ -Make turtledemo run without default root enabled. diff --git a/Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst b/Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst deleted file mode 100644 index c750447e9fe4a5..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-20-15-29-10.gh-issue-110910.u2oPwX.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix invalid state handling in :class:`asyncio.TaskGroup` and -:class:`asyncio.Timeout`. They now raise proper RuntimeError if they are -improperly used and are left in consistent state after this. diff --git a/Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst b/Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst deleted file mode 100644 index bdec4f4443d80b..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-21-13-57-06.gh-issue-111159.GoHp7s.rst +++ /dev/null @@ -1 +0,0 @@ -Fix :mod:`doctest` output comparison for exceptions with notes. diff --git a/Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst b/Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst deleted file mode 100644 index dc2424370bb96c..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-22-21-28-05.gh-issue-111187._W11Ab.rst +++ /dev/null @@ -1 +0,0 @@ -Postpone removal version for locale.getdefaultlocale() to Python 3.15. diff --git a/Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst b/Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst deleted file mode 100644 index 95c315404d0ee6..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-23-13-53-58.gh-issue-111174.Oohmzd.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix crash in :meth:`io.BytesIO.getbuffer` called repeatedly for empty -BytesIO. 
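The :meth:`io.BytesIO.getbuffer` entry above boils down to exporting the buffer of an empty ``BytesIO`` more than once; a minimal sketch::

    import io

    empty = io.BytesIO()
    for _ in range(3):
        view = empty.getbuffer()   # repeated exports previously could crash
        assert len(view) == 0
        view.release()             # drop each export before the next call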
diff --git a/Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst b/Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst deleted file mode 100644 index 3a87cb25da5cb4..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-24-12-09-46.gh-issue-111251.urFYtn.rst +++ /dev/null @@ -1 +0,0 @@ -Fix :mod:`_blake2` not checking for errors when initializing. diff --git a/Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst b/Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst deleted file mode 100644 index 757a7004cc1dc0..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-27-12-46-56.gh-issue-68166.0EbWW4.rst +++ /dev/null @@ -1,4 +0,0 @@ -Remove mention of not supported "vsapi" element type in -:meth:`tkinter.ttk.Style.element_create`. Add tests for ``element_create()`` -and other ``ttk.Style`` methods. Add examples for ``element_create()`` in -the documentation. diff --git a/Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst b/Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst deleted file mode 100644 index a821b52b982175..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-30-08-50-46.gh-issue-111356.Bc8LvA.rst +++ /dev/null @@ -1 +0,0 @@ -Added :func:`io.text_encoding()`, :data:`io.DEFAULT_BUFFER_SIZE`, and :class:`io.IncrementalNewlineDecoder` to ``io.__all__``. diff --git a/Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst b/Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst deleted file mode 100644 index b722f0414184b1..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-10-31-07-46-56.gh-issue-111531.6zUV_G.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix reference leaks in ``bind_class()`` and ``bind_all()`` methods of -:mod:`tkinter` widgets. diff --git a/Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst b/Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst deleted file mode 100644 index c59fe6b9119eca..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-01-14-03-24.gh-issue-110894.7-wZxC.rst +++ /dev/null @@ -1 +0,0 @@ -Call loop exception handler for exceptions in ``client_connected_cb`` of :func:`asyncio.start_server` so that applications can handle it. Patch by Kumar Aditya. diff --git a/Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst b/Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst deleted file mode 100644 index 719b63dad36fb7..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-04-10-24-25.gh-issue-111541.x0RBI1.rst +++ /dev/null @@ -1 +0,0 @@ -Fix :mod:`doctest` for :exc:`SyntaxError` not-builtin subclasses. diff --git a/Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst b/Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst deleted file mode 100644 index cd1780988aeac7..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-08-11-50-49.gh-issue-111841.iSqdQf.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix truncating arguments on an embedded null character in :meth:`os.putenv` -and :meth:`os.unsetenv` on Windows. 
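For the :meth:`os.putenv` entry above, the observable effect is that a value containing an embedded NUL is no longer silently truncated; a minimal sketch (assuming, as on POSIX, that the rejection surfaces as :exc:`ValueError`)::

    import os

    try:
        os.putenv("DEMO_VAR", "visible\x00hidden")
    except ValueError:
        # Rejected outright: nothing is truncated at the NUL and exported.
        pass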
diff --git a/Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst b/Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst deleted file mode 100644 index 2696f2f492a8b0..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-08-15-58-57.gh-issue-111804.uAXTOL.rst +++ /dev/null @@ -1,2 +0,0 @@ -Remove posix.fallocate() under WASI as the underlying posix_fallocate() is -not available in WASI preview2. diff --git a/Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst b/Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst deleted file mode 100644 index 4fc505c8f257a6..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-10-22-08-28.gh-issue-111942.MDFm6v.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix crashes in :meth:`io.TextIOWrapper.reconfigure` when pass invalid -arguments, e.g. non-string encoding. diff --git a/Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst b/Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst deleted file mode 100644 index d1ee4c054a3f19..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-11-16-42-48.gh-issue-109538.cMG5ux.rst +++ /dev/null @@ -1 +0,0 @@ -Issue warning message instead of having :class:`RuntimeError` be displayed when event loop has already been closed at :meth:`StreamWriter.__del__`. diff --git a/Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst b/Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst deleted file mode 100644 index ca58a6fa5d6ae1..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-14-18-43-55.gh-issue-111942.x1pnrj.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix SystemError in the TextIOWrapper constructor with non-encodable "errors" -argument in non-debug mode. diff --git a/Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst b/Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst deleted file mode 100644 index 4243dcb190434f..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-15-04-53-37.gh-issue-112105.I3RcVN.rst +++ /dev/null @@ -1 +0,0 @@ -Make :func:`readline.set_completer_delims` work with libedit diff --git a/Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst b/Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst deleted file mode 100644 index 41bd57f46ed82a..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-24-21-00-24.gh-issue-94722.GMIQIn.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix bug where comparison between instances of :class:`~doctest.DocTest` fails if -one of them has ``None`` as its lineno. diff --git a/Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst b/Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst deleted file mode 100644 index a16d67e7776bcb..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-11-28-20-01-33.gh-issue-112509.QtoKed.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix edge cases that could cause a key to be present in both the -``__required_keys__`` and ``__optional_keys__`` attributes of a -:class:`typing.TypedDict`. Patch by Jelle Zijlstra. diff --git a/Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst b/Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst deleted file mode 100644 index c732de15609c96..00000000000000 --- a/Misc/NEWS.d/next/Library/2023-12-02-12-55-17.gh-issue-112618.7_FT8-.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix a caching bug relating to :data:`typing.Annotated`. 
-``Annotated[str, True]`` is no longer identical to ``Annotated[str, 1]``. diff --git a/Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst b/Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst deleted file mode 100644 index b1a78370afedb2..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-09-05-20-46-35.gh-issue-108927.TpwWav.rst +++ /dev/null @@ -1,4 +0,0 @@ -Fixed order dependence in running tests in the same process -when a test that has submodules (e.g. test_importlib) follows a test that -imports its submodule (e.g. test_importlib.util) and precedes a test -(e.g. test_unittest or test_compileall) that uses that submodule. diff --git a/Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst b/Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst deleted file mode 100644 index 85c370fc87ac41..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-09-13-05-58-09.gh-issue-104736.lA25Fu.rst +++ /dev/null @@ -1,4 +0,0 @@ -Fix test_gdb on Python built with LLVM clang 16 on Linux ppc64le (ex: Fedora -38). Search patterns in gdb "bt" command output to detect when gdb fails to -retrieve the traceback. For example, skip a test if ``Backtrace stopped: frame -did not save the PC`` is found. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst b/Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst deleted file mode 100644 index 7b6007678388b1..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-09-28-12-25-19.gh-issue-109972.GYnwIP.rst +++ /dev/null @@ -1,2 +0,0 @@ -Split test_gdb.py file into a test_gdb package made of multiple tests, so tests -can now be run in parallel. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst b/Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst deleted file mode 100644 index a130cf690a57cb..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-09-29-00-19-21.gh-issue-109974.Sh_g-r.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix race conditions in test_threading lock tests. Wait until a condition is met -rather than using :func:`time.sleep` with a hardcoded number of seconds. Patch -by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst b/Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst deleted file mode 100644 index 2bae7715cc3d5b..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-03-10-54-09.gh-issue-110267.O-c47G.rst +++ /dev/null @@ -1,2 +0,0 @@ -Add tests for pickling and copying PyStructSequence objects. -Patched by Xuehai Pan. diff --git a/Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst b/Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst deleted file mode 100644 index d69f6746d9e9aa..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-05-13-46-50.gh-issue-81002.bOcuV6.rst +++ /dev/null @@ -1 +0,0 @@ -Add tests for :mod:`termios`. diff --git a/Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst b/Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst deleted file mode 100644 index caac41f81547de..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-05-14-22-48.gh-issue-110388.1-HQJO.rst +++ /dev/null @@ -1 +0,0 @@ -Add tests for :mod:`tty`. 
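A minimal check of the :data:`typing.Annotated` caching fix noted above (the two forms may still compare equal, since ``True == 1``; only the identity of the cached aliases changes)::

    from typing import Annotated

    # With the caching fix these are distinct alias objects rather than one
    # cached object returned for both spellings.
    assert Annotated[str, True] is not Annotated[str, 1]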
diff --git a/Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst b/Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst deleted file mode 100644 index d0cbbf9c3788bb..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-05-19-33-49.gh-issue-110167.mIdj3v.rst +++ /dev/null @@ -1,5 +0,0 @@ -Fix a deadlock in test_socket when server fails with a timeout but the -client is still running in its thread. Don't hold a lock to call cleanup -functions in doCleanups(). One of the cleanup function waits until the -client completes, whereas the client could deadlock if it called -addCleanup() in such situation. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst b/Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst deleted file mode 100644 index 90a7ca512c97dd..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-06-02-32-18.gh-issue-103053.VfxBLI.rst +++ /dev/null @@ -1,3 +0,0 @@ -Fix test_tools.test_freeze on FreeBSD: run "make distclean" instead of "make -clean" in the copied source directory to remove also the "python" program. -Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst b/Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst deleted file mode 100644 index 00f38c844755ec..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-10-23-20-13.gh-issue-110647.jKG3sY.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix test_stress_modifying_handlers() of test_signal. Patch by Victor -Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst b/Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst deleted file mode 100644 index 7cb79c0cbf29f1..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-16-13-47-24.gh-issue-110918.aFgZK3.rst +++ /dev/null @@ -1,4 +0,0 @@ -Test case matching patterns specified by options ``--match``, ``--ignore``, -``--matchfile`` and ``--ignorefile`` are now tested in the order of -specification, and the last match determines whether the test case be run or -ignored. diff --git a/Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst b/Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst deleted file mode 100644 index db29eaf234b731..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-17-17-54-36.gh-issue-110995.Fx8KRD.rst +++ /dev/null @@ -1,2 +0,0 @@ -test_gdb: Fix detection of gdb built without Python scripting support. Patch -by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst b/Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst deleted file mode 100644 index 45bb0774a9abe3..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-21-00-10-36.gh-issue-110932.jktjJU.rst +++ /dev/null @@ -1,2 +0,0 @@ -Fix regrtest if the ``SOURCE_DATE_EPOCH`` environment variable is defined: -use the variable value as the random seed. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst b/Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst deleted file mode 100644 index 753a3c00029ceb..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-21-19-27-36.gh-issue-111165.FU6mUk.rst +++ /dev/null @@ -1,2 +0,0 @@ -Remove no longer used functions ``run_unittest()`` and ``run_doctest()`` and -class ``BasicTestRunner`` from the :mod:`test.support` module. 
diff --git a/Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst b/Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst deleted file mode 100644 index ed0d3947ad3494..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-25-13-13-30.gh-issue-111309.Re7orL.rst +++ /dev/null @@ -1 +0,0 @@ -:mod:`distutils` tests can now be run via :mod:`unittest`. diff --git a/Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst b/Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst deleted file mode 100644 index 0254288d3626cc..00000000000000 --- a/Misc/NEWS.d/next/Tests/2023-10-31-22-09-25.gh-issue-110367.UhQi44.rst +++ /dev/null @@ -1,3 +0,0 @@ -Make regrtest ``--verbose3`` option compatible with ``--huntrleaks -jN`` -options. The ``./python -m test -j1 -R 3:3 --verbose3`` command now works as -expected. Patch by Victor Stinner. diff --git a/Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst b/Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst deleted file mode 100644 index 14a2aff70803ce..00000000000000 --- a/Misc/NEWS.d/next/Windows/2023-10-05-15-23-23.gh-issue-109286.N8OzMg.rst +++ /dev/null @@ -1 +0,0 @@ -Update Windows installer to use SQLite 3.43.1. diff --git a/Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst b/Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst deleted file mode 100644 index 777b4942e183f5..00000000000000 --- a/Misc/NEWS.d/next/Windows/2023-10-06-14-20-14.gh-issue-110437.xpYy9q.rst +++ /dev/null @@ -1,2 +0,0 @@ -Allows overriding the source of VC redistributables so that releases can be -guaranteed to never downgrade between updates. diff --git a/Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst b/Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst deleted file mode 100644 index d4c1b56d98ef0e..00000000000000 --- a/Misc/NEWS.d/next/Windows/2023-10-19-21-46-18.gh-issue-110913.CWlPfg.rst +++ /dev/null @@ -1 +0,0 @@ -WindowsConsoleIO now correctly chunks large buffers without splitting up UTF-8 sequences. diff --git a/Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst b/Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst deleted file mode 100644 index aeb64d25b09add..00000000000000 --- a/Misc/NEWS.d/next/macOS/2023-05-21-23-54-52.gh-issue-99834.6ANPts.rst +++ /dev/null @@ -1 +0,0 @@ -Update macOS installer to Tcl/Tk 8.6.13. diff --git a/Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst b/Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst deleted file mode 100644 index 477463c0c21264..00000000000000 --- a/Misc/NEWS.d/next/macOS/2023-08-30-16-33-57.gh-issue-92603.ATkKVO.rst +++ /dev/null @@ -1,3 +0,0 @@ -Update macOS installer to include a fix accepted by upstream Tcl/Tk -for a crash encountered after the first :meth:`tkinter.Tk` instance -is destroyed. diff --git a/Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst b/Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst deleted file mode 100644 index d8f3e429aab815..00000000000000 --- a/Misc/NEWS.d/next/macOS/2023-09-02-08-49-57.gh-issue-71383.Ttkchg.rst +++ /dev/null @@ -1,2 +0,0 @@ -Update macOS installer to include an upstream Tcl/Tk fix -for the ``ttk::ThemeChanged`` error encountered in Tkinter. 
diff --git a/Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst b/Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst deleted file mode 100644 index 18ac9df73deb37..00000000000000 --- a/Misc/NEWS.d/next/macOS/2023-10-04-23-38-24.gh-issue-109286.1ZLMaq.rst +++ /dev/null @@ -1 +0,0 @@ -Update macOS installer to use SQLite 3.43.1. diff --git a/Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst b/Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst deleted file mode 100644 index 4c6eea136554c8..00000000000000 --- a/Misc/NEWS.d/next/macOS/2023-10-18-01-40-36.gh-issue-111015.NaLI2L.rst +++ /dev/null @@ -1 +0,0 @@ -Ensure that IDLE.app and Python Launcher.app are installed with appropriate permissions on macOS builds. diff --git a/Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst b/Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst deleted file mode 100644 index c678c09f6aa55b..00000000000000 --- a/Misc/NEWS.d/next/macOS/2023-10-18-17-26-36.gh-issue-110950.sonoma.rst +++ /dev/null @@ -1,3 +0,0 @@ -Update macOS installer to include an upstream Tcl/Tk fix for the -``Secure coding is not enabled for restorable state!`` warning -encountered in Tkinter on macOS 14 Sonoma. diff --git a/README.rst b/README.rst index 1b82fa67d32da4..5910128874fbb9 100644 --- a/README.rst +++ b/README.rst @@ -1,4 +1,4 @@ -This is Python version 3.11.6 +This is Python version 3.11.7 ============================= .. image:: https://github.com/python/cpython/workflows/Tests/badge.svg