>>> py3-cloudpickle: Building community/py3-cloudpickle 3.1.1-r0 (using abuild 3.15.0-r0) started Sun, 12 Oct 2025 15:44:08 +0000
>>> py3-cloudpickle: Validating /home/udu/aports/community/py3-cloudpickle/APKBUILD...
>>> py3-cloudpickle: Analyzing dependencies...
>>> py3-cloudpickle: Installing for build: build-base py3-flit-core py3-gpep517 py3-installer py3-wheel py3-pytest py3-psutil py3-tornado py3-typing-extensions py3-numpy
WARNING: opening /home/udu/packages//community: No such file or directory
WARNING: opening /home/udu/packages//main: No such file or directory
fetch http://dl-cdn.alpinelinux.org/alpine/v3.22/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.22/community/x86_64/APKINDEX.tar.gz
(1/33) Installing py3-flit-core-pyc (3.12.0-r0)
(2/33) Installing py3-flit-core (3.12.0-r0)
(3/33) Installing py3-installer (0.7.0-r2)
(4/33) Installing py3-installer-pyc (0.7.0-r2)
(5/33) Installing py3-gpep517 (19-r0)
(6/33) Installing py3-gpep517-pyc (19-r0)
(7/33) Installing py3-parsing (3.2.3-r0)
(8/33) Installing py3-parsing-pyc (3.2.3-r0)
(9/33) Installing py3-packaging (25.0-r0)
(10/33) Installing py3-packaging-pyc (25.0-r0)
(11/33) Installing py3-wheel (0.46.1-r0)
(12/33) Installing py3-wheel-pyc (0.46.1-r0)
(13/33) Installing py3-iniconfig (2.1.0-r0)
(14/33) Installing py3-iniconfig-pyc (2.1.0-r0)
(15/33) Installing py3-pluggy (1.5.0-r0)
(16/33) Installing py3-pluggy-pyc (1.5.0-r0)
(17/33) Installing py3-py (1.11.0-r4)
(18/33) Installing py3-py-pyc (1.11.0-r4)
(19/33) Installing py3-pytest (8.3.5-r0)
(20/33) Installing py3-pytest-pyc (8.3.5-r0)
(21/33) Installing py3-psutil (7.0.0-r0)
(22/33) Installing py3-psutil-pyc (7.0.0-r0)
(23/33) Installing py3-tornado (6.5.1-r0)
(24/33) Installing py3-tornado-pyc (6.5.1-r0)
(25/33) Installing py3-typing-extensions (4.13.2-r0)
(26/33) Installing py3-typing-extensions-pyc (4.13.2-r0)
(27/33) Installing libquadmath (14.2.0-r6)
(28/33) Installing libgfortran (14.2.0-r6)
(29/33) Installing openblas (0.3.28-r0)
(30/33) Installing py3-numpy (2.2.4-r1)
(31/33) Installing py3-numpy-tests (2.2.4-r1)
(32/33) Installing py3-numpy-pyc (2.2.4-r1)
(33/33) Installing .makedepends-py3-cloudpickle (20251012.154408)
Executing busybox-1.37.0-r19.trigger
OK: 378 MiB in 122 packages
>>> py3-cloudpickle: Cleaning up srcdir
>>> py3-cloudpickle: Cleaning up pkgdir
>>> py3-cloudpickle: Cleaning up tmpdir
>>> py3-cloudpickle: Fetching py3-cloudpickle-3.1.1.tar.gz::https://github.com/cloudpipe/cloudpickle/archive/refs/tags/v3.1.1.tar.gz
>>> py3-cloudpickle: Fetching py3-cloudpickle-3.1.1.tar.gz::https://github.com/cloudpipe/cloudpickle/archive/refs/tags/v3.1.1.tar.gz
>>> py3-cloudpickle: Checking sha512sums...
py3-cloudpickle-3.1.1.tar.gz: OK
>>> py3-cloudpickle: Unpacking /var/cache/distfiles/py3-cloudpickle-3.1.1.tar.gz...
2025-10-12 15:44:09,664 gpep517 INFO Building wheel via backend flit_core.buildapi
2025-10-12 15:44:09,677 flit_core.wheel INFO Zip timestamps will be from SOURCE_DATE_EPOCH: 2025-07-21 04:56:13+00:00
2025-10-12 15:44:09,677 flit_core.wheel INFO Copying package file(s) from cloudpickle
2025-10-12 15:44:09,678 flit_core.wheel INFO Writing metadata files
2025-10-12 15:44:09,678 flit_core.wheel INFO Writing the record of files
2025-10-12 15:44:09,678 flit_core.wheel INFO Built wheel: .dist/cloudpickle-3.1.1-py3-none-any.whl
2025-10-12 15:44:09,678 gpep517 INFO The backend produced .dist/cloudpickle-3.1.1-py3-none-any.whl
cloudpickle-3.1.1-py3-none-any.whl
============================= test session starts ==============================
platform linux -- Python 3.12.11, pytest-8.3.5, pluggy-1.5.0
rootdir: /home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1
configfile: tox.ini
collected 272 items

tests/cloudpickle_file_test.py .......
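[Editor's aside, not part of the log: the `flit_core.wheel` line above shows abuild's reproducible-build setup, where zip timestamps come from the `SOURCE_DATE_EPOCH` environment variable rather than the wall clock. A small sketch cross-checking the timestamp printed in the log; the integer epoch value is computed here, since the log never prints it.]

```python
from datetime import datetime, timezone

# The log reports: "Zip timestamps will be from SOURCE_DATE_EPOCH:
# 2025-07-21 04:56:13+00:00". Recover the epoch integer abuild would
# have exported, and round-trip it back to the printed timestamp.
epoch = int(datetime(2025, 7, 21, 4, 56, 13, tzinfo=timezone.utc).timestamp())
print(epoch)  # 1753073773
print(datetime.fromtimestamp(epoch, tz=timezone.utc).isoformat(sep=" "))
# 2025-07-21 04:56:13+00:00
```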
tests/cloudpickle_test.py .......s........s...............F.F...FxF............FF.....FFFFFFF............F........s..F........sF...s.s.....s...................s........s...............F.F...FxF............FF.....FFFFFFF............F........s..F......s.sF...s.s.....s..............s.s.
tests/test_backward_compat.py sssssss

=================================== FAILURES ===================================
_ CloudPickleTest.test_deterministic_dynamic_class_attr_ordering_for_chained_pickling _

self =

    def test_deterministic_dynamic_class_attr_ordering_for_chained_pickling(self):
        # Check that the pickle produced by pickling a reconstructed class definition
        # in a remote process matches the pickle produced by pickling the original
        # class definition.
        # In particular, this test checks that the order of the class attributes is
        # deterministic.
>       with subprocess_worker(protocol=self.protocol) as w:

tests/cloudpickle_test.py:2082:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
tests/testutils.py:174: in subprocess_worker
    worker = _Worker(protocol=protocol)
tests/testutils.py:141: in __init__
    self.pool = ProcessPoolExecutor(max_workers=1)
/usr/lib/python3.12/concurrent/futures/process.py:747: in __init__
    self._call_queue = _SafeQueue(
/usr/lib/python3.12/concurrent/futures/process.py:177: in __init__
    super().__init__(max_size, ctx=ctx)
/usr/lib/python3.12/multiprocessing/queues.py:43: in __init__
    self._rlock = ctx.Lock()
/usr/lib/python3.12/multiprocessing/context.py:68: in Lock
    return Lock(ctx=self.get_context())
/usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , kind = 1, value = 1, maxvalue = 1

    def __init__(self, kind, value, maxvalue, *, ctx):
        if ctx is None:
            ctx = context._default_context.get_context()
        self._is_fork_ctx = ctx.get_start_method() == 'fork'
        unlink_now = sys.platform == 'win32' or self._is_fork_ctx
        for i in range(100):
            try:
>               sl = self._semlock = _multiprocessing.SemLock(
                    kind, value, maxvalue, self._make_name(), unlink_now)
E               PermissionError: [Errno 13] Permission denied

/usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError
_ CloudPickleTest.test_deterministic_str_interning_for_chained_dynamic_class_pickling _

self =

    def test_deterministic_str_interning_for_chained_dynamic_class_pickling(self):
        # Check that the pickle produced by the unpickled instance is the same.
        # This checks that there is no issue related to the string interning of
        # the names of attributes of class definitions and names of attributes
        # of the `__code__` objects of the methods.
>       with subprocess_worker(protocol=self.protocol) as w:

tests/cloudpickle_test.py:2121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
tests/testutils.py:174: in subprocess_worker
    worker = _Worker(protocol=protocol)
tests/testutils.py:141: in __init__
    self.pool = ProcessPoolExecutor(max_workers=1)
/usr/lib/python3.12/concurrent/futures/process.py:747: in __init__
    self._call_queue = _SafeQueue(
/usr/lib/python3.12/concurrent/futures/process.py:177: in __init__
    super().__init__(max_size, ctx=ctx)
/usr/lib/python3.12/multiprocessing/queues.py:43: in __init__
    self._rlock = ctx.Lock()
/usr/lib/python3.12/multiprocessing/context.py:68: in Lock
    return Lock(ctx=self.get_context())
/usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , kind = 1, value = 1, maxvalue = 1

    def __init__(self, kind, value, maxvalue, *, ctx):
        if ctx is None:
            ctx = context._default_context.get_context()
        self._is_fork_ctx = ctx.get_start_method() == 'fork'
        unlink_now = sys.platform == 'win32' or self._is_fork_ctx
        for i in range(100):
            try:
>               sl = self._semlock = _multiprocessing.SemLock(
                    kind, value, maxvalue, self._make_name(), unlink_now)
E               PermissionError: [Errno 13] Permission denied

/usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError
__ CloudPickleTest.test_dynamic_class_determinist_subworker_tuple_memoization __

self =

    def test_dynamic_class_determinist_subworker_tuple_memoization(self):
        # Check that the pickle produced by the unpickled instance is the same.
        # This highlights some issues with tuple memoization.
>       with subprocess_worker(protocol=self.protocol) as w:

tests/cloudpickle_test.py:2160:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
tests/testutils.py:174: in subprocess_worker
    worker = _Worker(protocol=protocol)
tests/testutils.py:141: in __init__
    self.pool = ProcessPoolExecutor(max_workers=1)
/usr/lib/python3.12/concurrent/futures/process.py:747: in __init__
    self._call_queue = _SafeQueue(
/usr/lib/python3.12/concurrent/futures/process.py:177: in __init__
    super().__init__(max_size, ctx=ctx)
/usr/lib/python3.12/multiprocessing/queues.py:43: in __init__
    self._rlock = ctx.Lock()
/usr/lib/python3.12/multiprocessing/context.py:68: in Lock
    return Lock(ctx=self.get_context())
/usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , kind = 1, value = 1, maxvalue = 1

    def __init__(self, kind, value, maxvalue, *, ctx):
        if ctx is None:
            ctx = context._default_context.get_context()
        self._is_fork_ctx = ctx.get_start_method() == 'fork'
        unlink_now = sys.platform == 'win32' or self._is_fork_ctx
        for i in range(100):
            try:
>               sl = self._semlock = _multiprocessing.SemLock(
                    kind, value, maxvalue, self._make_name(), unlink_now)
E               PermissionError: [Errno 13] Permission denied

/usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError
__________ CloudPickleTest.test_dynamic_func_deterministic_roundtrip ___________

self =

    def test_dynamic_func_deterministic_roundtrip(self):
        # Check that the pickle serialization for a dynamic func is the same
        # in two processes.
        def get_dynamic_func_pickle():
            def test_method(arg_1, arg_2):
                pass
            return cloudpickle.dumps(test_method)

>       with subprocess_worker(protocol=self.protocol) as w:

tests/cloudpickle_test.py:2048:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
tests/testutils.py:174: in subprocess_worker
    worker = _Worker(protocol=protocol)
tests/testutils.py:141: in __init__
    self.pool = ProcessPoolExecutor(max_workers=1)
/usr/lib/python3.12/concurrent/futures/process.py:747: in __init__
    self._call_queue = _SafeQueue(
/usr/lib/python3.12/concurrent/futures/process.py:177: in __init__
    super().__init__(max_size, ctx=ctx)
/usr/lib/python3.12/multiprocessing/queues.py:43: in __init__
    self._rlock = ctx.Lock()
/usr/lib/python3.12/multiprocessing/context.py:68: in Lock
    return Lock(ctx=self.get_context())
/usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , kind = 1, value = 1, maxvalue = 1

    def __init__(self, kind, value, maxvalue, *, ctx):
        if ctx is None:
            ctx = context._default_context.get_context()
        self._is_fork_ctx = ctx.get_start_method() == 'fork'
        unlink_now = sys.platform == 'win32' or self._is_fork_ctx
        for i in range(100):
            try:
>               sl = self._semlock = _multiprocessing.SemLock(
                    kind, value, maxvalue, self._make_name(), unlink_now)
E               PermissionError: [Errno 13] Permission denied

/usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError
____________________ CloudPickleTest.test_generic_subclass _____________________

self =
    def test_generic_subclass(self):
        T = typing.TypeVar("T")

        class Base(typing.Generic[T]):
            pass

        class DerivedAny(Base):
            pass

        class LeafAny(DerivedAny):
            pass

        class DerivedInt(Base[int]):
            pass

        class LeafInt(DerivedInt):
            pass

        class DerivedT(Base[T]):
            pass

        class LeafT(DerivedT[T]):
            pass

        klasses = [Base, DerivedAny, LeafAny, DerivedInt, LeafInt, DerivedT, LeafT]
        for klass in klasses:
            assert pickle_depickle(klass, protocol=self.protocol) is klass

>       with subprocess_worker(protocol=self.protocol) as worker:

tests/cloudpickle_test.py:2633:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
tests/testutils.py:174: in subprocess_worker
    worker = _Worker(protocol=protocol)
tests/testutils.py:141: in __init__
    self.pool = ProcessPoolExecutor(max_workers=1)
/usr/lib/python3.12/concurrent/futures/process.py:747: in __init__
    self._call_queue = _SafeQueue(
/usr/lib/python3.12/concurrent/futures/process.py:177: in __init__
    super().__init__(max_size, ctx=ctx)
/usr/lib/python3.12/multiprocessing/queues.py:43: in __init__
    self._rlock = ctx.Lock()
/usr/lib/python3.12/multiprocessing/context.py:68: in Lock
    return Lock(ctx=self.get_context())
/usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , kind = 1, value = 1, maxvalue = 1

    def __init__(self, kind, value, maxvalue, *, ctx):
        if ctx is None:
            ctx = context._default_context.get_context()
        self._is_fork_ctx = ctx.get_start_method() == 'fork'
        unlink_now = sys.platform == 'win32' or self._is_fork_ctx
        for i in range(100):
            try:
>               sl = self._semlock = _multiprocessing.SemLock(
                    kind, value, maxvalue, self._make_name(), unlink_now)
E               PermissionError: [Errno 13] Permission denied

/usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError
______________________ CloudPickleTest.test_generic_type _______________________

self =

    def test_generic_type(self):
        T = typing.TypeVar("T")

        class C(typing.Generic[T]):
            pass

        assert pickle_depickle(C, protocol=self.protocol) is C

        # Identity is not part of the typing contract: only test for
        # equality instead.
        assert pickle_depickle(C[int], protocol=self.protocol) == C[int]

>       with subprocess_worker(protocol=self.protocol) as worker:

tests/cloudpickle_test.py:2587:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/contextlib.py:137: in __enter__
    return next(self.gen)
tests/testutils.py:174: in subprocess_worker
    worker = _Worker(protocol=protocol)
tests/testutils.py:141: in __init__
    self.pool = ProcessPoolExecutor(max_workers=1)
/usr/lib/python3.12/concurrent/futures/process.py:747: in __init__
    self._call_queue = _SafeQueue(
/usr/lib/python3.12/concurrent/futures/process.py:177: in __init__
    super().__init__(max_size, ctx=ctx)
/usr/lib/python3.12/multiprocessing/queues.py:43: in __init__
    self._rlock = ctx.Lock()
/usr/lib/python3.12/multiprocessing/context.py:68: in Lock
    return Lock(ctx=self.get_context())
/usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__
    SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , kind = 1, value = 1, maxvalue = 1

    def __init__(self, kind, value, maxvalue, *, ctx):
        if ctx is None:
            ctx = context._default_context.get_context()
        self._is_fork_ctx = ctx.get_start_method() == 'fork'
        unlink_now = sys.platform == 'win32' or self._is_fork_ctx
        for i in range(100):
            try:
>               sl = self._semlock = _multiprocessing.SemLock(
                    kind, value, maxvalue, self._make_name(), unlink_now)
E               PermissionError: [Errno 13] Permission denied

/usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError
______ CloudPickleTest.test_interactive_dynamic_type_and_remote_instances ______

source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n\n with subprocess_worker(protoco...ounter\n assert isinstance(c1, CustomCounter)\n assert isinstance(c2, CustomCounter)\n\n '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and
        not print anything on stderr or stdout.
        """
        fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py")
        os.close(fd)
        try:
            with open(source_file, "wb") as f:
                f.write(source_code.encode("utf-8"))
            cmd = [sys.executable, "-W ignore", source_file]
            cwd, env = _make_cwd_env()
            kwargs = {
                "cwd": cwd,
                "stderr": STDOUT,
                "env": env,
            }
            # If coverage is running, pass the config file to the subprocess
            coverage_rc = os.environ.get("COVERAGE_PROCESS_START")
            if coverage_rc:
                kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc
            kwargs["timeout"] = timeout
            try:
                try:
>                   out = check_output(cmd, **kwargs)

tests/testutils.py:204:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/subprocess.py:466: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = 60, check = True
popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpyxl6lvl__src_test_cloudpickle.py'],)
kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1}
process =
stdout = b'Traceback (most recent call last):\n File "/tmp/tmpyxl6lvl__src_test_cloudpickle.py", line 4, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n'
stderr = None, retcode = 1

    def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those
        attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order
        to capture them, or pass capture_output=True to capture both.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return
        code in the returncode attribute, and output & stderr attributes if
        those streams were captured.

        If timeout (seconds) is given and the process takes too long, a
        TimeoutExpired exception will be raised.

        There is an optional argument "input", allowing you to pass bytes or a
        string to the subprocess's stdin. If you use this argument you may not
        also use the Popen constructor's "stdin" argument, as it will be used
        internally.

        By default, all communication is in bytes, and therefore any "input"
        should be bytes, and the stdout and stderr will be bytes. If in text
        mode, any "input" should be a string, and stdout and stderr will be
        strings decoded according to locale encoding, or by "encoding" if set.
        Text mode is triggered by setting any of text, encoding, errors or
        universal_newlines.

        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads. communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpyxl6lvl__src_test_cloudpickle.py']' returned non-zero exit status 1.

/usr/lib/python3.12/subprocess.py:571: CalledProcessError

The above exception was the direct cause of the following exception:

self =

    def test_interactive_dynamic_type_and_remote_instances(self):
        code = """if __name__ == "__main__":
        from testutils import subprocess_worker

        with subprocess_worker(protocol={protocol}) as w:

            class CustomCounter:
                def __init__(self):
                    self.count = 0
                def increment(self):
                    self.count += 1
                    return self

            counter = CustomCounter().increment()
            assert counter.count == 1

            returned_counter = w.run(counter.increment)
            assert returned_counter.count == 2, returned_counter.count

            # Check that the class definition of the returned instance was
            # matched back to the original class definition living in __main__.
            assert isinstance(returned_counter, CustomCounter)

            # Check that memoization does not break provenance tracking:
            def echo(*args):
                return args

            C1, C2, c1, c2 = w.run(echo, CustomCounter, CustomCounter,
                                   CustomCounter(), returned_counter)
            assert C1 is CustomCounter
            assert C2 is CustomCounter
            assert isinstance(c1, CustomCounter)
            assert isinstance(c2, CustomCounter)
        """.format(protocol=self.protocol)
>       assert_run_python_script(code)

tests/cloudpickle_test.py:1959:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n\n with subprocess_worker(protoco...ounter\n assert isinstance(c1, CustomCounter)\n assert isinstance(c2, CustomCounter)\n\n '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and
        not print anything on stderr or stdout.
        """
        fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py")
        os.close(fd)
        try:
            with open(source_file, "wb") as f:
                f.write(source_code.encode("utf-8"))
            cmd = [sys.executable, "-W ignore", source_file]
            cwd, env = _make_cwd_env()
            kwargs = {
                "cwd": cwd,
                "stderr": STDOUT,
                "env": env,
            }
            # If coverage is running, pass the config file to the subprocess
            coverage_rc = os.environ.get("COVERAGE_PROCESS_START")
            if coverage_rc:
                kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc
            kwargs["timeout"] = timeout
            try:
                try:
                    out = check_output(cmd, **kwargs)
                except CalledProcessError as e:
>                   raise RuntimeError(
                        "script errored with output:\n%s" % e.output.decode("utf-8")
                    ) from e
E                   RuntimeError: script errored with output:
E                   Traceback (most recent call last):
E                     File "/tmp/tmpyxl6lvl__src_test_cloudpickle.py", line 4, in
E                       with subprocess_worker(protocol=5) as w:
E                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__
E                       return next(self.gen)
E                              ^^^^^^^^^^^^^^
E                     File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker
E                       worker = _Worker(protocol=protocol)
E                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__
E                       self.pool = ProcessPoolExecutor(max_workers=1)
E                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__
E                       self._call_queue = _SafeQueue(
E                                          ^^^^^^^^^^^
E                     File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__
E                       super().__init__(max_size, ctx=ctx)
E                     File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__
E                       self._rlock = ctx.Lock()
E                                     ^^^^^^^^^^
E                     File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock
E                       return Lock(ctx=self.get_context())
E                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__
E                       SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
E                     File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__
E                       sl = self._semlock = _multiprocessing.SemLock(
E                            ^^^^^^^^^^^^^^^^^^^^^^^^^
E                   PermissionError: [Errno 13] Permission denied

tests/testutils.py:206: RuntimeError
__ CloudPickleTest.test_interactive_dynamic_type_and_stored_remote_instances ___

source_code = 'if __name__ == "__main__":\n import cloudpickle, uuid\n from testutils import subprocess_worker\n\n ... class\n # method:\n assert w.run(lambda obj_id: lookup(obj_id).echo(43), id2) == 43\n\n '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and
        not print anything on stderr or stdout.
        """
        fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py")
        os.close(fd)
        try:
            with open(source_file, "wb") as f:
                f.write(source_code.encode("utf-8"))
            cmd = [sys.executable, "-W ignore", source_file]
            cwd, env = _make_cwd_env()
            kwargs = {
                "cwd": cwd,
                "stderr": STDOUT,
                "env": env,
            }
            # If coverage is running, pass the config file to the subprocess
            coverage_rc = os.environ.get("COVERAGE_PROCESS_START")
            if coverage_rc:
                kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc
            kwargs["timeout"] = timeout
            try:
                try:
>                   out = check_output(cmd, **kwargs)

tests/testutils.py:204:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/subprocess.py:466: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = 60, check = True
popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpavrffltn_src_test_cloudpickle.py'],)
kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1}
process =
stdout = b'Traceback (most recent call last):\n File "/tmp/tmpavrffltn_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n'
stderr = None, retcode = 1

    def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those
        attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order
        to capture them, or pass capture_output=True to capture both.

        If check is True and the exit code was non-zero, it raises a
        CalledProcessError. The CalledProcessError object will have the return
        code in the returncode attribute, and output & stderr attributes if
        those streams were captured.

        If timeout (seconds) is given and the process takes too long, a
        TimeoutExpired exception will be raised.

        There is an optional argument "input", allowing you to pass bytes or a
        string to the subprocess's stdin. If you use this argument you may not
        also use the Popen constructor's "stdin" argument, as it will be used
        internally.

        By default, all communication is in bytes, and therefore any "input"
        should be bytes, and the stdout and stderr will be bytes. If in text
        mode, any "input" should be a string, and stdout and stderr will be
        strings decoded according to locale encoding, or by "encoding" if set.
        Text mode is triggered by setting any of text, encoding, errors or
        universal_newlines.

        The other arguments are the same as for the Popen constructor.
        """
        if input is not None:
            if kwargs.get('stdin') is not None:
                raise ValueError('stdin and input arguments may not both be used.')
            kwargs['stdin'] = PIPE

        if capture_output:
            if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None:
                raise ValueError('stdout and stderr arguments may not be used '
                                 'with capture_output.')
            kwargs['stdout'] = PIPE
            kwargs['stderr'] = PIPE

        with Popen(*popenargs, **kwargs) as process:
            try:
                stdout, stderr = process.communicate(input, timeout=timeout)
            except TimeoutExpired as exc:
                process.kill()
                if _mswindows:
                    # Windows accumulates the output in a single blocking
                    # read() call run on child threads, with the timeout
                    # being done in a join() on those threads. communicate()
                    # _after_ kill() is required to collect that and add it
                    # to the exception.
                    exc.stdout, exc.stderr = process.communicate()
                else:
                    # POSIX _communicate already populated the output so
                    # far into the TimeoutExpired exception.
                    process.wait()
                raise
            except:  # Including KeyboardInterrupt, communicate handled that.
                process.kill()
                # We don't call process.wait() as .__exit__ does that for us.
                raise
            retcode = process.poll()
            if check and retcode:
>               raise CalledProcessError(retcode, process.args,
                                         output=stdout, stderr=stderr)
E               subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpavrffltn_src_test_cloudpickle.py']' returned non-zero exit status 1.

/usr/lib/python3.12/subprocess.py:571: CalledProcessError

The above exception was the direct cause of the following exception:

self =

    def test_interactive_dynamic_type_and_stored_remote_instances(self):
        """Simulate objects stored on workers to check isinstance semantics

        Such instances stored in the memory of running worker processes are
        similar to dask-distributed futures for instance.
        """
        code = """if __name__ == "__main__":
        import cloudpickle, uuid
        from testutils import subprocess_worker

        with subprocess_worker(protocol={protocol}) as w:

            class A:
                '''Original class definition'''
                pass

            def store(x):
                storage = getattr(cloudpickle, "_test_storage", None)
                if storage is None:
                    storage = cloudpickle._test_storage = dict()
                obj_id = uuid.uuid4().hex
                storage[obj_id] = x
                return obj_id

            def lookup(obj_id):
                return cloudpickle._test_storage[obj_id]

            id1 = w.run(store, A())

            # The stored object on the worker is matched to a singleton class
            # definition thanks to provenance tracking:
            assert w.run(lambda obj_id: isinstance(lookup(obj_id), A), id1)

            # Retrieving the object from the worker yields a local copy that
            # is matched back the local class definition this instance
            # originally stems from.
            assert isinstance(w.run(lookup, id1), A)

            # Changing the local class definition should be taken into account
            # in all subsequent calls. In particular the old instances on the
            # worker do not map back to the new class definition, neither on
            # the worker itself, nor locally on the main program when the old
            # instance is retrieved:

            class A:
                '''Updated class definition'''

            assert not w.run(lambda obj_id: isinstance(lookup(obj_id), A), id1)
            retrieved1 = w.run(lookup, id1)
            assert not isinstance(retrieved1, A)
            assert retrieved1.__class__ is not A
            assert retrieved1.__class__.__doc__ == "Original class definition"

            # New instances on the other hand are proper instances of the new
            # class definition everywhere:
            a = A()
            id2 = w.run(store, a)
            assert w.run(lambda obj_id: isinstance(lookup(obj_id), A), id2)
            assert isinstance(w.run(lookup, id2), A)

            # Monkeypatch the class defintion in the main process to a new
            # class method:
            A.echo = lambda cls, x: x

            # Calling this method on an instance will automatically update
            # the remote class definition on the worker to propagate the monkey
            # patch dynamically.
            assert w.run(a.echo, 42) == 42

            # The stored instance can therefore also access the new class
            # method:
            assert w.run(lambda obj_id: lookup(obj_id).echo(43), id2) == 43
        """.format(protocol=self.protocol)
>       assert_run_python_script(code)

tests/cloudpickle_test.py:2036:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_code = 'if __name__ == "__main__":\n import cloudpickle, uuid\n from testutils import subprocess_worker\n\n ... class\n # method:\n assert w.run(lambda obj_id: lookup(obj_id).echo(43), id2) == 43\n\n '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and
        not print anything on stderr or stdout.
        """
        fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py")
        os.close(fd)
        try:
            with open(source_file, "wb") as f:
                f.write(source_code.encode("utf-8"))
            cmd = [sys.executable, "-W ignore", source_file]
            cwd, env = _make_cwd_env()
            kwargs = {
                "cwd": cwd,
                "stderr": STDOUT,
                "env": env,
            }
            # If coverage is running, pass the config file to the subprocess
            coverage_rc = os.environ.get("COVERAGE_PROCESS_START")
            if coverage_rc:
                kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc
            kwargs["timeout"] = timeout
            try:
                try:
                    out = check_output(cmd, **kwargs)
                except CalledProcessError as e:
>                   raise RuntimeError(
                        "script errored with output:\n%s" % e.output.decode("utf-8")
                    ) from e
E                   RuntimeError: script errored with output:
E                   Traceback (most recent call last):
E                     File "/tmp/tmpavrffltn_src_test_cloudpickle.py", line 5, in
E                       with subprocess_worker(protocol=5) as w:
E                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__
E                       return next(self.gen)
E                              ^^^^^^^^^^^^^^
E                     File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker
E                       worker = _Worker(protocol=protocol)
E                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__
E                       self.pool = ProcessPoolExecutor(max_workers=1)
E                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__
E                       self._call_queue = _SafeQueue(
E                                          ^^^^^^^^^^^
E                     File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__
E                       super().__init__(max_size, ctx=ctx)
E                     File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__
E                       self._rlock = ctx.Lock()
E                                     ^^^^^^^^^^
E                     File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock
E                       return Lock(ctx=self.get_context())
E                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__
E                       SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
E                     File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__
E                       sl = self._semlock = _multiprocessing.SemLock(
E                            ^^^^^^^^^^^^^^^^^^^^^^^^^
E                   PermissionError: [Errno 13] Permission denied

tests/testutils.py:206: RuntimeError
____________ CloudPickleTest.test_interactive_remote_function_calls ____________

source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n\n def interactive_function(x):\n... # previous definition of `interactive_function`:\n\n assert w.run(wrapper_func, 41) == 40\n '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and
        not print anything on stderr or stdout.
        """
        fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py")
        os.close(fd)
        try:
            with open(source_file, "wb") as f:
                f.write(source_code.encode("utf-8"))
            cmd = [sys.executable, "-W ignore", source_file]
            cwd, env = _make_cwd_env()
            kwargs = {
                "cwd": cwd,
                "stderr": STDOUT,
                "env": env,
            }
            # If coverage is running, pass the config file to the subprocess
            coverage_rc = os.environ.get("COVERAGE_PROCESS_START")
            if coverage_rc:
                kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc
            kwargs["timeout"] = timeout
            try:
                try:
>                   out = check_output(cmd, **kwargs)

tests/testutils.py:204:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.12/subprocess.py:466: in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

input = None, capture_output = False, timeout = 60, check = True
popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpxz9fx_yt_src_test_cloudpickle.py'],)
kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1}
process =
stdout = b'Traceback (most recent call last):\n File "/tmp/tmpxz9fx_yt_src_test_cloudpickle.py", line 7, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n'
stderr = None, retcode = 1

    def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs):
        """Run command with arguments and return a CompletedProcess instance.

        The returned instance will have attributes args, returncode, stdout and
        stderr. By default, stdout and stderr are not captured, and those
        attributes will be None.
Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpxz9fx_yt_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactive_remote_function_calls(self): code = """if __name__ == "__main__": from testutils import subprocess_worker def interactive_function(x): return x + 1 with subprocess_worker(protocol={protocol}) as w: assert w.run(interactive_function, 41) == 42 # Define a new function that will call an updated version of # the previously called function: def wrapper_func(x): return interactive_function(x) def interactive_function(x): return x - 1 # The change in the definition of interactive_function in the main # module of the main process should be reflected transparently # in the worker process: the worker process does not recall the # previous definition of `interactive_function`: assert w.run(wrapper_func, 41) == 40 """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:1876: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n\n def interactive_function(x):\n... 
# previous definition of `interactive_function`:\n\n assert w.run(wrapper_func, 41) == 40\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpxz9fx_yt_src_test_cloudpickle.py", line 7, in E with subprocess_worker(protocol=5) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File 
"/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError ____ CloudPickleTest.test_interactive_remote_function_calls_no_memory_leak _____ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import struct\n\n with su... # iterations instead of 100 as used now (100x more data)\n assert growth < 5e7, growth\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpfe1_gvak_src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmpfe1_gvak_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. 
Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpfe1_gvak_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = @pytest.mark.skipif( platform.python_implementation() == "PyPy", reason="Skip PyPy because memory grows too much", ) def test_interactive_remote_function_calls_no_memory_leak(self): code = """if __name__ == "__main__": from testutils import subprocess_worker import struct with subprocess_worker(protocol={protocol}) as w: reference_size = w.memsize() assert reference_size > 0 def make_big_closure(i): # Generate a byte string of size 1MB itemsize = len(struct.pack("l", 1)) data = struct.pack("l", i) * (int(1e6) // itemsize) def process_data(): return len(data) return process_data for i in range(100): func = make_big_closure(i) result = w.run(func) assert result == int(1e6), result import gc w.run(gc.collect) # By this time the worker process has processed 100MB worth of data # passed in the closures. The worker memory size should not have # grown by more than a few MB as closures are garbage collected at # the end of each remote function call. growth = w.memsize() - reference_size # For some reason, the memory growth after processing 100MB of # data is ~50MB on MacOS, and ~1MB on Linux, so the upper bound on # memory growth we use is only tight for MacOS. 
However, # - 50MB is still 2x lower than the expected memory growth in case # of a leak (which would be the total size of the processed data, # 100MB) # - the memory usage growth does not increase if using 10000 # iterations instead of 100 as used now (100x more data) assert growth < 5e7, growth """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:2224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import struct\n\n with su... # iterations instead of 100 as used now (100x more data)\n assert growth < 5e7, growth\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpfe1_gvak_src_test_cloudpickle.py", line 5, in E with subprocess_worker(protocol=5) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File 
"/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError ____ CloudPickleTest.test_interactive_remote_function_calls_no_side_effect _____ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import sys\n\n with subpr... assert is_in_main("GLOBAL_VARIABLE")\n assert not w.run(is_in_main, "GLOBAL_VARIABLE")\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp5_z752y5_src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmp5_z752y5_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. 
Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp5_z752y5_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactive_remote_function_calls_no_side_effect(self): code = """if __name__ == "__main__": from testutils import subprocess_worker import sys with subprocess_worker(protocol={protocol}) as w: GLOBAL_VARIABLE = 0 class CustomClass(object): def mutate_globals(self): global GLOBAL_VARIABLE GLOBAL_VARIABLE += 1 return GLOBAL_VARIABLE custom_object = CustomClass() assert w.run(custom_object.mutate_globals) == 1 # The caller global variable is unchanged in the main process. assert GLOBAL_VARIABLE == 0 # Calling the same function again starts again from zero. 
The # worker process is stateless: it has no memory of the past call: assert w.run(custom_object.mutate_globals) == 1 # The symbols defined in the main process __main__ module are # not set in the worker process main module to leave the worker # as stateless as possible: def is_in_main(name): return hasattr(sys.modules["__main__"], name) assert is_in_main("CustomClass") assert not w.run(is_in_main, "CustomClass") assert is_in_main("GLOBAL_VARIABLE") assert not w.run(is_in_main, "GLOBAL_VARIABLE") """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:1920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import sys\n\n with subpr... assert is_in_main("GLOBAL_VARIABLE")\n assert not w.run(is_in_main, "GLOBAL_VARIABLE")\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmp5_z752y5_src_test_cloudpickle.py", line 5, in E with subprocess_worker(protocol=5) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E 
SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _ CloudPickleTest.test_interactively_defined_dataclass_with_initvar_and_classvar _ source_code = 'if __name__ == "__main__":\n import dataclasses\n from testutils import subprocess_worker\n impo... assert cloned_type is SampleDataclass\n assert isinstance(cloned_value, SampleDataclass)\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpx79zz3rt_src_test_cloudpickle.py'],) kwargs = {'cwd': 
'/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmpx79zz3rt_src_test_cloudpickle.py", line 6, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. 
""" if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpx79zz3rt_src_test_cloudpickle.py']' returned non-zero exit status 1. 
/usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactively_defined_dataclass_with_initvar_and_classvar(self): code = """if __name__ == "__main__": import dataclasses from testutils import subprocess_worker import typing with subprocess_worker(protocol={protocol}) as w: @dataclasses.dataclass class SampleDataclass: x: int y: dataclasses.InitVar[int] = None z: typing.ClassVar[int] = 42 def __post_init__(self, y=0): self.x += y def large_enough(self): return self.x > self.z value = SampleDataclass(2, y=2) def check_dataclass_instance(value): assert isinstance(value, SampleDataclass) assert value.x == 4 assert value.z == 42 expected_dict = dict(x=4) assert dataclasses.asdict(value) == expected_dict assert not value.large_enough() try: SampleDataclass.z = 0 assert value.z == 0 assert value.large_enough() finally: SampleDataclass.z = 42 return "ok" assert check_dataclass_instance(value) == "ok" # Check that this instance of an interactively defined dataclass # behaves consistently in a remote worker process: assert w.run(check_dataclass_instance, value) == "ok" # Check class provenance tracking is not impacted by the # @dataclass decorator: def echo(*args): return args cloned_value, cloned_type = w.run(echo, value, SampleDataclass) assert cloned_type is SampleDataclass assert isinstance(cloned_value, SampleDataclass) """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:3035: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n import dataclasses\n from testutils import subprocess_worker\n impo... 
assert cloned_type is SampleDataclass\n assert isinstance(cloned_value, SampleDataclass)\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpx79zz3rt_src_test_cloudpickle.py", line 6, in E with subprocess_worker(protocol=5) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File 
"/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _______________ CloudPickleTest.test_interactively_defined_enum ________________ source_code = 'if __name__ == "__main__":\n from enum import Enum\n from testutils import subprocess_worker\n\n ...= 0 else Color.RED\n\n result = w.run(check_positive, 1)\n assert result is Color.BLUE\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmps674sa47_src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmps674sa47_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. 
[... subprocess.run() docstring and body elided; identical to the copy printed with the first failure above ...] 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmps674sa47_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactively_defined_enum(self): code = """if __name__ == "__main__": from enum import Enum from testutils import subprocess_worker with subprocess_worker(protocol={protocol}) as w: class Color(Enum): RED = 1 GREEN = 2 def check_positive(x): return Color.GREEN if x >= 0 else Color.RED result = w.run(check_positive, 1) # Check that the returned enum instance is reconciled with the # locally defined Color enum type definition: assert result is Color.GREEN # Check that changing the definition of the Enum class is taken # into account on the worker for subsequent calls: class Color(Enum): RED = 1 BLUE = 2 def check_positive(x): return Color.BLUE if x >= 0 else Color.RED result = w.run(check_positive, 1) assert result is Color.BLUE """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:2426: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n from enum import Enum\n from testutils import subprocess_worker\n\n ...= 0 else Color.RED\n\n result = w.run(check_positive, 1)\n assert result is Color.BLUE\n ' timeout = 60 def assert_run_python_script(source_code, 
timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmps674sa47_src_test_cloudpickle.py", line 5, in E with subprocess_worker(protocol=5) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File 
"/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError __________ CloudPickleTest.test_locally_defined_class_with_type_hints __________ self = def test_locally_defined_class_with_type_hints(self): > with subprocess_worker(protocol=self.protocol) as worker: tests/cloudpickle_test.py:2645: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, 
maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _______________ CloudPickleTest.test_multiprocessing_lock_raises _______________ self = def test_multiprocessing_lock_raises(self): > lock = multiprocessing.Lock() tests/cloudpickle_test.py:1180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _ CloudPickleTest.test_pickle_constructs_from_module_registered_for_pickling_by_value _ self = def test_pickle_constructs_from_module_registered_for_pickling_by_value( self, ): # noqa _prev_sys_path = sys.path.copy() try: # We simulate an interactive session that: # - we start from the /path/to/cloudpickle/tests directory, where a # local .py file (mock_local_file) is located. # - uses constructs from mock_local_file in remote workers that do # not have access to this file. This situation is # the justification behind the # (un)register_pickle_by_value(module) api that cloudpickle # exposes. 
_mock_interactive_session_cwd = os.path.dirname(__file__) # First, remove sys.path entries that could point to # /path/to/cloudpickle/tests and be in inherited by the worker _maybe_remove(sys.path, "") _maybe_remove(sys.path, _mock_interactive_session_cwd) # Add the desired session working directory sys.path.insert(0, _mock_interactive_session_cwd) > with subprocess_worker(protocol=self.protocol) as w: tests/cloudpickle_test.py:2743: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _ Protocol2CloudPickleTest.test_deterministic_dynamic_class_attr_ordering_for_chained_pickling _ self = def 
test_deterministic_dynamic_class_attr_ordering_for_chained_pickling(self): # Check that the pickle produced by pickling a reconstructed class definition # in a remote process matches the pickle produced by pickling the original # class definition. # In particular, this test checks that the order of the class attributes is # deterministic. > with subprocess_worker(protocol=self.protocol) as w: tests/cloudpickle_test.py:2082: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _ Protocol2CloudPickleTest.test_deterministic_str_interning_for_chained_dynamic_class_pickling _ self = def 
test_deterministic_str_interning_for_chained_dynamic_class_pickling(self): # Check that the pickle produced by the unpickled instance is the same. # This checks that there is no issue related to the string interning of # the names of attributes of class definitions and names of attributes # of the `__code__` objects of the methods. > with subprocess_worker(protocol=self.protocol) as w: tests/cloudpickle_test.py:2121: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _ Protocol2CloudPickleTest.test_dynamic_class_determinist_subworker_tuple_memoization _ self = def test_dynamic_class_determinist_subworker_tuple_memoization(self): # 
Check that the pickle produced by the unpickled instance is the same. # This highlights some issues with tuple memoization. > with subprocess_worker(protocol=self.protocol) as w: tests/cloudpickle_test.py:2160: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError ______ Protocol2CloudPickleTest.test_dynamic_func_deterministic_roundtrip ______ self = def test_dynamic_func_deterministic_roundtrip(self): # Check that the pickle serialization for a dynamic func is the same # in two processes. 
def get_dynamic_func_pickle(): def test_method(arg_1, arg_2): pass return cloudpickle.dumps(test_method) > with subprocess_worker(protocol=self.protocol) as w: tests/cloudpickle_test.py:2048: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError ________________ Protocol2CloudPickleTest.test_generic_subclass ________________ self = def test_generic_subclass(self): T = typing.TypeVar("T") class Base(typing.Generic[T]): pass class DerivedAny(Base): pass class LeafAny(DerivedAny): pass class DerivedInt(Base[int]): pass class LeafInt(DerivedInt): pass class DerivedT(Base[T]): pass class LeafT(DerivedT[T]): pass klasses = [Base, 
DerivedAny, LeafAny, DerivedInt, LeafInt, DerivedT, LeafT] for klass in klasses: assert pickle_depickle(klass, protocol=self.protocol) is klass > with subprocess_worker(protocol=self.protocol) as worker: tests/cloudpickle_test.py:2633: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError __________________ Protocol2CloudPickleTest.test_generic_type __________________ self = def test_generic_type(self): T = typing.TypeVar("T") class C(typing.Generic[T]): pass assert pickle_depickle(C, protocol=self.protocol) is C # Identity is not part of the typing contract: only test for # equality instead. 
assert pickle_depickle(C[int], protocol=self.protocol) == C[int] > with subprocess_worker(protocol=self.protocol) as worker: tests/cloudpickle_test.py:2587: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _ Protocol2CloudPickleTest.test_interactive_dynamic_type_and_remote_instances __ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n\n with subprocess_worker(protoco...ounter\n assert isinstance(c1, CustomCounter)\n assert isinstance(c2, CustomCounter)\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in 
__main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp3kg1usna_src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmp3kg1usna_src_test_cloudpickle.py", line 4, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. 
By default, stdout and stderr are not captured, and those attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp3kg1usna_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactive_dynamic_type_and_remote_instances(self): code = """if __name__ == "__main__": from testutils import subprocess_worker with subprocess_worker(protocol={protocol}) as w: class CustomCounter: def __init__(self): self.count = 0 def increment(self): self.count += 1 return self counter = CustomCounter().increment() assert counter.count == 1 returned_counter = w.run(counter.increment) assert returned_counter.count == 2, returned_counter.count # Check that the class definition of the returned instance was # matched back to the original class definition living in __main__. 
            assert isinstance(returned_counter, CustomCounter)

            # Check that memoization does not break provenance tracking:
            def echo(*args):
                return args

            C1, C2, c1, c2 = w.run(echo, CustomCounter, CustomCounter,
                                   CustomCounter(), returned_counter)
            assert C1 is CustomCounter
            assert C2 is CustomCounter
            assert isinstance(c1, CustomCounter)
            assert isinstance(c2, CustomCounter)
        """.format(protocol=self.protocol)
>       assert_run_python_script(code)

tests/cloudpickle_test.py:1959:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_code = 'if __name__ == "__main__":\n    from testutils import subprocess_worker\n\n    with subprocess_worker(protoco...ounter\n    assert isinstance(c1, CustomCounter)\n    assert isinstance(c2, CustomCounter)\n\n    '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and not print
        anything on stderr or stdout.
        """
        fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py")
        os.close(fd)
        try:
            with open(source_file, "wb") as f:
                f.write(source_code.encode("utf-8"))
            cmd = [sys.executable, "-W ignore", source_file]
            cwd, env = _make_cwd_env()
            kwargs = {
                "cwd": cwd,
                "stderr": STDOUT,
                "env": env,
            }
            # If coverage is running, pass the config file to the subprocess
            coverage_rc = os.environ.get("COVERAGE_PROCESS_START")
            if coverage_rc:
                kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc
            kwargs["timeout"] = timeout
            try:
                try:
                    out = check_output(cmd, **kwargs)
                except CalledProcessError as e:
>                   raise RuntimeError(
                        "script errored with output:\n%s" % e.output.decode("utf-8")
                    ) from e
E                   RuntimeError: script errored with output:
E                   Traceback (most recent call last):
E                     File "/tmp/tmp3kg1usna_src_test_cloudpickle.py", line 4, in <module>
E                       with subprocess_worker(protocol=2) as w:
E                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__
E                       return next(self.gen)
E                              ^^^^^^^^^^^^^^
E                     File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker
E                       worker = _Worker(protocol=protocol)
E                                ^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__
E                       self.pool = ProcessPoolExecutor(max_workers=1)
E                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__
E                       self._call_queue = _SafeQueue(
E                                          ^^^^^^^^^^^
E                     File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__
E                       super().__init__(max_size, ctx=ctx)
E                     File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__
E                       self._rlock = ctx.Lock()
E                                     ^^^^^^^^^^
E                     File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock
E                       return Lock(ctx=self.get_context())
E                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E                     File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__
E                       SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx)
E                     File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__
E                       sl = self._semlock = _multiprocessing.SemLock(
E                            ^^^^^^^^^^^^^^^^^^^^^^^^^
E                   PermissionError: [Errno 13] Permission denied

tests/testutils.py:206: RuntimeError
_ Protocol2CloudPickleTest.test_interactive_dynamic_type_and_stored_remote_instances _

source_code = 'if __name__ == "__main__":\n    import cloudpickle, uuid\n    from testutils import subprocess_worker\n\n    ... class\n    # method:\n    assert w.run(lambda obj_id: lookup(obj_id).echo(43), id2) == 43\n\n    '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and not print
        anything on stderr or stdout.
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp4invdym__src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmp4invdym__src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. 
Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp4invdym__src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactive_dynamic_type_and_stored_remote_instances(self): """Simulate objects stored on workers to check isinstance semantics Such instances stored in the memory of running worker processes are similar to dask-distributed futures for instance. """ code = """if __name__ == "__main__": import cloudpickle, uuid from testutils import subprocess_worker with subprocess_worker(protocol={protocol}) as w: class A: '''Original class definition''' pass def store(x): storage = getattr(cloudpickle, "_test_storage", None) if storage is None: storage = cloudpickle._test_storage = dict() obj_id = uuid.uuid4().hex storage[obj_id] = x return obj_id def lookup(obj_id): return cloudpickle._test_storage[obj_id] id1 = w.run(store, A()) # The stored object on the worker is matched to a singleton class # definition thanks to provenance tracking: assert w.run(lambda obj_id: isinstance(lookup(obj_id), A), id1) # Retrieving the object from the worker yields a local copy that # is matched back the local class definition this instance # originally stems from. assert isinstance(w.run(lookup, id1), A) # Changing the local class definition should be taken into account # in all subsequent calls. 
In particular the old instances on the
            # worker do not map back to the new class definition, neither on
            # the worker itself, nor locally on the main program when the old
            # instance is retrieved:
            class A:
                '''Updated class definition'''

            assert not w.run(lambda obj_id: isinstance(lookup(obj_id), A), id1)
            retrieved1 = w.run(lookup, id1)
            assert not isinstance(retrieved1, A)
            assert retrieved1.__class__ is not A
            assert retrieved1.__class__.__doc__ == "Original class definition"

            # New instances on the other hand are proper instances of the new
            # class definition everywhere:
            a = A()
            id2 = w.run(store, a)
            assert w.run(lambda obj_id: isinstance(lookup(obj_id), A), id2)
            assert isinstance(w.run(lookup, id2), A)

            # Monkeypatch the class definition in the main process to a new
            # class method:
            A.echo = lambda cls, x: x

            # Calling this method on an instance will automatically update
            # the remote class definition on the worker to propagate the monkey
            # patch dynamically.
            assert w.run(a.echo, 42) == 42

            # The stored instance can therefore also access the new class
            # method:
            assert w.run(lambda obj_id: lookup(obj_id).echo(43), id2) == 43
        """.format(protocol=self.protocol)
>       assert_run_python_script(code)

tests/cloudpickle_test.py:2036:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_code = 'if __name__ == "__main__":\n    import cloudpickle, uuid\n    from testutils import subprocess_worker\n\n    ... class\n    # method:\n    assert w.run(lambda obj_id: lookup(obj_id).echo(43), id2) == 43\n\n    '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and not print
        anything on stderr or stdout.
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmp4invdym__src_test_cloudpickle.py", line 5, in E with subprocess_worker(protocol=2) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E 
SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _______ Protocol2CloudPickleTest.test_interactive_remote_function_calls ________ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n\n def interactive_function(x):\n... # previous definition of `interactive_function`:\n\n assert w.run(wrapper_func, 41) == 40\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpe5q84aq7_src_test_cloudpickle.py'],) kwargs = {'cwd': 
'/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmpe5q84aq7_src_test_cloudpickle.py", line 7, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. 
""" if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpe5q84aq7_src_test_cloudpickle.py']' returned non-zero exit status 1. 
/usr/lib/python3.12/subprocess.py:571: CalledProcessError

The above exception was the direct cause of the following exception:

self =

    def test_interactive_remote_function_calls(self):
        code = """if __name__ == "__main__":
        from testutils import subprocess_worker

        def interactive_function(x):
            return x + 1

        with subprocess_worker(protocol={protocol}) as w:
            assert w.run(interactive_function, 41) == 42

            # Define a new function that will call an updated version of
            # the previously called function:
            def wrapper_func(x):
                return interactive_function(x)

            def interactive_function(x):
                return x - 1

            # The change in the definition of interactive_function in the main
            # module of the main process should be reflected transparently
            # in the worker process: the worker process does not recall the
            # previous definition of `interactive_function`:
            assert w.run(wrapper_func, 41) == 40
        """.format(protocol=self.protocol)
>       assert_run_python_script(code)

tests/cloudpickle_test.py:1876:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_code = 'if __name__ == "__main__":\n    from testutils import subprocess_worker\n\n    def interactive_function(x):\n... # previous definition of `interactive_function`:\n\n    assert w.run(wrapper_func, 41) == 40\n    '
timeout = 60

    def assert_run_python_script(source_code, timeout=TIMEOUT):
        """Utility to help check pickleability of objects defined in __main__

        The script provided in the source code should return 0 and not print
        anything on stderr or stdout.
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpe5q84aq7_src_test_cloudpickle.py", line 7, in E with subprocess_worker(protocol=2) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E 
SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _ Protocol2CloudPickleTest.test_interactive_remote_function_calls_no_memory_leak _ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import struct\n\n with su... # iterations instead of 100 as used now (100x more data)\n assert growth < 5e7, growth\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpgkuvj32z_src_test_cloudpickle.py'],) kwargs = {'cwd': 
'/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmpgkuvj32z_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. 
""" if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpgkuvj32z_src_test_cloudpickle.py']' returned non-zero exit status 1. 
/usr/lib/python3.12/subprocess.py:571: CalledProcessError

The above exception was the direct cause of the following exception:

self =

    @pytest.mark.skipif(
        platform.python_implementation() == "PyPy",
        reason="Skip PyPy because memory grows too much",
    )
    def test_interactive_remote_function_calls_no_memory_leak(self):
        code = """if __name__ == "__main__":
        from testutils import subprocess_worker
        import struct

        with subprocess_worker(protocol={protocol}) as w:

            reference_size = w.memsize()
            assert reference_size > 0

            def make_big_closure(i):
                # Generate a byte string of size 1MB
                itemsize = len(struct.pack("l", 1))
                data = struct.pack("l", i) * (int(1e6) // itemsize)

                def process_data():
                    return len(data)

                return process_data

            for i in range(100):
                func = make_big_closure(i)
                result = w.run(func)
                assert result == int(1e6), result

            import gc
            w.run(gc.collect)

            # By this time the worker process has processed 100MB worth of data
            # passed in the closures. The worker memory size should not have
            # grown by more than a few MB as closures are garbage collected at
            # the end of each remote function call.
            growth = w.memsize() - reference_size

            # For some reason, the memory growth after processing 100MB of
            # data is ~50MB on MacOS, and ~1MB on Linux, so the upper bound on
            # memory growth we use is only tight for MacOS. However,
            # - 50MB is still 2x lower than the expected memory growth in case
            #   of a leak (which would be the total size of the processed data,
            #   100MB)
            # - the memory usage growth does not increase if using 10000
            #   iterations instead of 100 as used now (100x more data)
            assert growth < 5e7, growth
        """.format(protocol=self.protocol)
>       assert_run_python_script(code)

tests/cloudpickle_test.py:2224:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

source_code = 'if __name__ == "__main__":\n    from testutils import subprocess_worker\n    import struct\n\n    with su...
# iterations instead of 100 as used now (100x more data)\n assert growth < 5e7, growth\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpgkuvj32z_src_test_cloudpickle.py", line 5, in <module> E with subprocess_worker(protocol=2) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File
"/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _ Protocol2CloudPickleTest.test_interactive_remote_function_calls_no_side_effect _ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import sys\n\n with subpr... assert is_in_main("GLOBAL_VARIABLE")\n assert not w.run(is_in_main, "GLOBAL_VARIABLE")\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpkxd639lf_src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmpkxd639lf_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. 
Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpkxd639lf_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactive_remote_function_calls_no_side_effect(self): code = """if __name__ == "__main__": from testutils import subprocess_worker import sys with subprocess_worker(protocol={protocol}) as w: GLOBAL_VARIABLE = 0 class CustomClass(object): def mutate_globals(self): global GLOBAL_VARIABLE GLOBAL_VARIABLE += 1 return GLOBAL_VARIABLE custom_object = CustomClass() assert w.run(custom_object.mutate_globals) == 1 # The caller global variable is unchanged in the main process. assert GLOBAL_VARIABLE == 0 # Calling the same function again starts again from zero. 
The # worker process is stateless: it has no memory of the past call: assert w.run(custom_object.mutate_globals) == 1 # The symbols defined in the main process __main__ module are # not set in the worker process main module to leave the worker # as stateless as possible: def is_in_main(name): return hasattr(sys.modules["__main__"], name) assert is_in_main("CustomClass") assert not w.run(is_in_main, "CustomClass") assert is_in_main("GLOBAL_VARIABLE") assert not w.run(is_in_main, "GLOBAL_VARIABLE") """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:1920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n from testutils import subprocess_worker\n import sys\n\n with subpr... assert is_in_main("GLOBAL_VARIABLE")\n assert not w.run(is_in_main, "GLOBAL_VARIABLE")\n\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpkxd639lf_src_test_cloudpickle.py", line 5, in E with subprocess_worker(protocol=2) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E 
SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _ Protocol2CloudPickleTest.test_interactively_defined_dataclass_with_initvar_and_classvar _ source_code = 'if __name__ == "__main__":\n import dataclasses\n from testutils import subprocess_worker\n impo... assert cloned_type is SampleDataclass\n assert isinstance(cloned_value, SampleDataclass)\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpc5govo9p_src_test_cloudpickle.py'],) kwargs = {'cwd': 
'/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmpc5govo9p_src_test_cloudpickle.py", line 6, in <module>\n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor.
""" if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmpc5govo9p_src_test_cloudpickle.py']' returned non-zero exit status 1. 
/usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactively_defined_dataclass_with_initvar_and_classvar(self): code = """if __name__ == "__main__": import dataclasses from testutils import subprocess_worker import typing with subprocess_worker(protocol={protocol}) as w: @dataclasses.dataclass class SampleDataclass: x: int y: dataclasses.InitVar[int] = None z: typing.ClassVar[int] = 42 def __post_init__(self, y=0): self.x += y def large_enough(self): return self.x > self.z value = SampleDataclass(2, y=2) def check_dataclass_instance(value): assert isinstance(value, SampleDataclass) assert value.x == 4 assert value.z == 42 expected_dict = dict(x=4) assert dataclasses.asdict(value) == expected_dict assert not value.large_enough() try: SampleDataclass.z = 0 assert value.z == 0 assert value.large_enough() finally: SampleDataclass.z = 42 return "ok" assert check_dataclass_instance(value) == "ok" # Check that this instance of an interactively defined dataclass # behaves consistently in a remote worker process: assert w.run(check_dataclass_instance, value) == "ok" # Check class provenance tracking is not impacted by the # @dataclass decorator: def echo(*args): return args cloned_value, cloned_type = w.run(echo, value, SampleDataclass) assert cloned_type is SampleDataclass assert isinstance(cloned_value, SampleDataclass) """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:3035: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n import dataclasses\n from testutils import subprocess_worker\n impo...
assert cloned_type is SampleDataclass\n assert isinstance(cloned_value, SampleDataclass)\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmpc5govo9p_src_test_cloudpickle.py", line 6, in <module> E with subprocess_worker(protocol=2) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File
"/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError ___________ Protocol2CloudPickleTest.test_interactively_defined_enum ___________ source_code = 'if __name__ == "__main__":\n from enum import Enum\n from testutils import subprocess_worker\n\n ...= 0 else Color.RED\n\n result = w.run(check_positive, 1)\n assert result is Color.BLUE\n ' timeout = 60 def assert_run_python_script(source_code, timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. 
""" fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: > out = check_output(cmd, **kwargs) tests/testutils.py:204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/subprocess.py:466: in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ input = None, capture_output = False, timeout = 60, check = True popenargs = (['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp3s7c1sir_src_test_cloudpickle.py'],) kwargs = {'cwd': '/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1', 'env': {'ABUILD_LAST_COMMIT': '', 'BUILDCC...ack-clash-protection -Wformat -Werror=format-security -fno-plt', 'BUILDCPPFLAGS': '', ...}, 'stderr': -2, 'stdout': -1} process = stdout = b'Traceback (most recent call last):\n File "/tmp/tmp3s7c1sir_src_test_cloudpickle.py", line 5, in \n with...ocessing.SemLock(\n ^^^^^^^^^^^^^^^^^^^^^^^^^\nPermissionError: [Errno 13] Permission denied\n' stderr = None, retcode = 1 def run(*popenargs, input=None, capture_output=False, timeout=None, check=False, **kwargs): """Run command with arguments and return a CompletedProcess instance. The returned instance will have attributes args, returncode, stdout and stderr. By default, stdout and stderr are not captured, and those attributes will be None. 
Pass stdout=PIPE and/or stderr=PIPE in order to capture them, or pass capture_output=True to capture both. If check is True and the exit code was non-zero, it raises a CalledProcessError. The CalledProcessError object will have the return code in the returncode attribute, and output & stderr attributes if those streams were captured. If timeout (seconds) is given and the process takes too long, a TimeoutExpired exception will be raised. There is an optional argument "input", allowing you to pass bytes or a string to the subprocess's stdin. If you use this argument you may not also use the Popen constructor's "stdin" argument, as it will be used internally. By default, all communication is in bytes, and therefore any "input" should be bytes, and the stdout and stderr will be bytes. If in text mode, any "input" should be a string, and stdout and stderr will be strings decoded according to locale encoding, or by "encoding" if set. Text mode is triggered by setting any of text, encoding, errors or universal_newlines. The other arguments are the same as for the Popen constructor. """ if input is not None: if kwargs.get('stdin') is not None: raise ValueError('stdin and input arguments may not both be used.') kwargs['stdin'] = PIPE if capture_output: if kwargs.get('stdout') is not None or kwargs.get('stderr') is not None: raise ValueError('stdout and stderr arguments may not be used ' 'with capture_output.') kwargs['stdout'] = PIPE kwargs['stderr'] = PIPE with Popen(*popenargs, **kwargs) as process: try: stdout, stderr = process.communicate(input, timeout=timeout) except TimeoutExpired as exc: process.kill() if _mswindows: # Windows accumulates the output in a single blocking # read() call run on child threads, with the timeout # being done in a join() on those threads. communicate() # _after_ kill() is required to collect that and add it # to the exception. 
exc.stdout, exc.stderr = process.communicate() else: # POSIX _communicate already populated the output so # far into the TimeoutExpired exception. process.wait() raise except: # Including KeyboardInterrupt, communicate handled that. process.kill() # We don't call process.wait() as .__exit__ does that for us. raise retcode = process.poll() if check and retcode: > raise CalledProcessError(retcode, process.args, output=stdout, stderr=stderr) E subprocess.CalledProcessError: Command '['/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/.testenv/bin/python3', '-W ignore', '/tmp/tmp3s7c1sir_src_test_cloudpickle.py']' returned non-zero exit status 1. /usr/lib/python3.12/subprocess.py:571: CalledProcessError The above exception was the direct cause of the following exception: self = def test_interactively_defined_enum(self): code = """if __name__ == "__main__": from enum import Enum from testutils import subprocess_worker with subprocess_worker(protocol={protocol}) as w: class Color(Enum): RED = 1 GREEN = 2 def check_positive(x): return Color.GREEN if x >= 0 else Color.RED result = w.run(check_positive, 1) # Check that the returned enum instance is reconciled with the # locally defined Color enum type definition: assert result is Color.GREEN # Check that changing the definition of the Enum class is taken # into account on the worker for subsequent calls: class Color(Enum): RED = 1 BLUE = 2 def check_positive(x): return Color.BLUE if x >= 0 else Color.RED result = w.run(check_positive, 1) assert result is Color.BLUE """.format(protocol=self.protocol) > assert_run_python_script(code) tests/cloudpickle_test.py:2426: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ source_code = 'if __name__ == "__main__":\n from enum import Enum\n from testutils import subprocess_worker\n\n ...= 0 else Color.RED\n\n result = w.run(check_positive, 1)\n assert result is Color.BLUE\n ' timeout = 60 def assert_run_python_script(source_code, 
timeout=TIMEOUT): """Utility to help check pickleability of objects defined in __main__ The script provided in the source code should return 0 and not print anything on stderr or stdout. """ fd, source_file = tempfile.mkstemp(suffix="_src_test_cloudpickle.py") os.close(fd) try: with open(source_file, "wb") as f: f.write(source_code.encode("utf-8")) cmd = [sys.executable, "-W ignore", source_file] cwd, env = _make_cwd_env() kwargs = { "cwd": cwd, "stderr": STDOUT, "env": env, } # If coverage is running, pass the config file to the subprocess coverage_rc = os.environ.get("COVERAGE_PROCESS_START") if coverage_rc: kwargs["env"]["COVERAGE_PROCESS_START"] = coverage_rc kwargs["timeout"] = timeout try: try: out = check_output(cmd, **kwargs) except CalledProcessError as e: > raise RuntimeError( "script errored with output:\n%s" % e.output.decode("utf-8") ) from e E RuntimeError: script errored with output: E Traceback (most recent call last): E File "/tmp/tmp3s7c1sir_src_test_cloudpickle.py", line 5, in E with subprocess_worker(protocol=2) as w: E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/contextlib.py", line 137, in __enter__ E return next(self.gen) E ^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 174, in subprocess_worker E worker = _Worker(protocol=protocol) E ^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/tests/testutils.py", line 141, in __init__ E self.pool = ProcessPoolExecutor(max_workers=1) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 747, in __init__ E self._call_queue = _SafeQueue( E ^^^^^^^^^^^ E File "/usr/lib/python3.12/concurrent/futures/process.py", line 177, in __init__ E super().__init__(max_size, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/queues.py", line 43, in __init__ E self._rlock = ctx.Lock() E ^^^^^^^^^^ E File 
"/usr/lib/python3.12/multiprocessing/context.py", line 68, in Lock E return Lock(ctx=self.get_context()) E ^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 169, in __init__ E SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) E File "/usr/lib/python3.12/multiprocessing/synchronize.py", line 57, in __init__ E sl = self._semlock = _multiprocessing.SemLock( E ^^^^^^^^^^^^^^^^^^^^^^^^^ E PermissionError: [Errno 13] Permission denied tests/testutils.py:206: RuntimeError _____ Protocol2CloudPickleTest.test_locally_defined_class_with_type_hints ______ self = def test_locally_defined_class_with_type_hints(self): > with subprocess_worker(protocol=self.protocol) as worker: tests/cloudpickle_test.py:2645: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, 
maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError __________ Protocol2CloudPickleTest.test_multiprocessing_lock_raises ___________ self = def test_multiprocessing_lock_raises(self): > lock = multiprocessing.Lock() tests/cloudpickle_test.py:1180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError _ Protocol2CloudPickleTest.test_pickle_constructs_from_module_registered_for_pickling_by_value _ self = def test_pickle_constructs_from_module_registered_for_pickling_by_value( self, ): # noqa _prev_sys_path = sys.path.copy() try: # We simulate an interactive session that: # - we start from the /path/to/cloudpickle/tests directory, where a # local .py file (mock_local_file) is located. # - uses constructs from mock_local_file in remote workers that do # not have access to this file. This situation is # the justification behind the # (un)register_pickle_by_value(module) api that cloudpickle # exposes. 
_mock_interactive_session_cwd = os.path.dirname(__file__) # First, remove sys.path entries that could point to # /path/to/cloudpickle/tests and be in inherited by the worker _maybe_remove(sys.path, "") _maybe_remove(sys.path, _mock_interactive_session_cwd) # Add the desired session working directory sys.path.insert(0, _mock_interactive_session_cwd) > with subprocess_worker(protocol=self.protocol) as w: tests/cloudpickle_test.py:2743: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.12/contextlib.py:137: in __enter__ return next(self.gen) tests/testutils.py:174: in subprocess_worker worker = _Worker(protocol=protocol) tests/testutils.py:141: in __init__ self.pool = ProcessPoolExecutor(max_workers=1) /usr/lib/python3.12/concurrent/futures/process.py:747: in __init__ self._call_queue = _SafeQueue( /usr/lib/python3.12/concurrent/futures/process.py:177: in __init__ super().__init__(max_size, ctx=ctx) /usr/lib/python3.12/multiprocessing/queues.py:43: in __init__ self._rlock = ctx.Lock() /usr/lib/python3.12/multiprocessing/context.py:68: in Lock return Lock(ctx=self.get_context()) /usr/lib/python3.12/multiprocessing/synchronize.py:169: in __init__ SemLock.__init__(self, SEMAPHORE, 1, 1, ctx=ctx) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , kind = 1, value = 1, maxvalue = 1 def __init__(self, kind, value, maxvalue, *, ctx): if ctx is None: ctx = context._default_context.get_context() self._is_fork_ctx = ctx.get_start_method() == 'fork' unlink_now = sys.platform == 'win32' or self._is_fork_ctx for i in range(100): try: > sl = self._semlock = _multiprocessing.SemLock( kind, value, maxvalue, self._make_name(), unlink_now) E PermissionError: [Errno 13] Permission denied /usr/lib/python3.12/multiprocessing/synchronize.py:57: PermissionError =============================== warnings summary =============================== tests/cloudpickle_test.py::CloudPickleTest::test_itertools_count 
tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_itertools_count
  /home/udu/aports/community/py3-cloudpickle/src/cloudpickle-3.1.1/cloudpickle/cloudpickle.py:1303: DeprecationWarning: Pickle, copy, and deepcopy support will be removed from itertools in Python 3.14.
    return super().dump(obj)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_deterministic_dynamic_class_attr_ordering_for_chained_pickling
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_deterministic_str_interning_for_chained_dynamic_class_pickling
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_dynamic_class_determinist_subworker_tuple_memoization
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_dynamic_func_deterministic_roundtrip
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_generic_subclass - Pe...
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_generic_type - Permis...
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactive_dynamic_type_and_remote_instances
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactive_dynamic_type_and_stored_remote_instances
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactive_remote_function_calls
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactive_remote_function_calls_no_memory_leak
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactive_remote_function_calls_no_side_effect
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactively_defined_dataclass_with_initvar_and_classvar
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_interactively_defined_enum
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_locally_defined_class_with_type_hints
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_multiprocessing_lock_raises
FAILED tests/cloudpickle_test.py::CloudPickleTest::test_pickle_constructs_from_module_registered_for_pickling_by_value
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_deterministic_dynamic_class_attr_ordering_for_chained_pickling
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_deterministic_str_interning_for_chained_dynamic_class_pickling
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_dynamic_class_determinist_subworker_tuple_memoization
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_dynamic_func_deterministic_roundtrip
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_generic_subclass
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_generic_type
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactive_dynamic_type_and_remote_instances
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactive_dynamic_type_and_stored_remote_instances
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactive_remote_function_calls
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactive_remote_function_calls_no_memory_leak
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactive_remote_function_calls_no_side_effect
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactively_defined_dataclass_with_initvar_and_classvar
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_interactively_defined_enum
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_locally_defined_class_with_type_hints
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_multiprocessing_lock_raises
FAILED tests/cloudpickle_test.py::Protocol2CloudPickleTest::test_pickle_constructs_from_module_registered_for_pickling_by_value
====== 32 failed, 214 passed, 24 skipped, 2 xfailed, 2 warnings in 8.41s =======
>>> ERROR: py3-cloudpickle: check failed
>>> py3-cloudpickle: Uninstalling dependencies...
(1/33) Purging .makedepends-py3-cloudpickle (20251012.154408)
(2/33) Purging py3-flit-core-pyc (3.12.0-r0)
(3/33) Purging py3-flit-core (3.12.0-r0)
(4/33) Purging py3-gpep517-pyc (19-r0)
(5/33) Purging py3-gpep517 (19-r0)
(6/33) Purging py3-installer-pyc (0.7.0-r2)
(7/33) Purging py3-installer (0.7.0-r2)
(8/33) Purging py3-wheel-pyc (0.46.1-r0)
(9/33) Purging py3-wheel (0.46.1-r0)
(10/33) Purging py3-pytest-pyc (8.3.5-r0)
(11/33) Purging py3-pytest (8.3.5-r0)
(12/33) Purging py3-iniconfig-pyc (2.1.0-r0)
(13/33) Purging py3-iniconfig (2.1.0-r0)
(14/33) Purging py3-packaging-pyc (25.0-r0)
(15/33) Purging py3-packaging (25.0-r0)
(16/33) Purging py3-parsing-pyc (3.2.3-r0)
(17/33) Purging py3-parsing (3.2.3-r0)
(18/33) Purging py3-pluggy-pyc (1.5.0-r0)
(19/33) Purging py3-pluggy (1.5.0-r0)
(20/33) Purging py3-py-pyc (1.11.0-r4)
(21/33) Purging py3-py (1.11.0-r4)
(22/33) Purging py3-psutil-pyc (7.0.0-r0)
(23/33) Purging py3-psutil (7.0.0-r0)
(24/33) Purging py3-tornado-pyc (6.5.1-r0)
(25/33) Purging py3-tornado (6.5.1-r0)
(26/33) Purging py3-typing-extensions-pyc (4.13.2-r0)
(27/33) Purging py3-typing-extensions (4.13.2-r0)
(28/33) Purging py3-numpy-tests (2.2.4-r1)
(29/33) Purging py3-numpy-pyc (2.2.4-r1)
(30/33) Purging py3-numpy (2.2.4-r1)
(31/33) Purging openblas (0.3.28-r0)
(32/33) Purging libgfortran (14.2.0-r6)
(33/33) Purging libquadmath (14.2.0-r6)
Executing busybox-1.37.0-r19.trigger
OK: 296 MiB in 89 packages
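Note: every failure in the log above bottoms out in the same call chain, `ProcessPoolExecutor` / `multiprocessing.Lock()` -> `synchronize.SemLock` -> `_multiprocessing.SemLock`, raising `PermissionError: [Errno 13]`. This points at the build sandbox denying POSIX semaphore creation rather than a cloudpickle bug. A minimal probe, independent of the test suite (the helper name `semaphores_available` is ours, not part of cloudpickle or abuild), reproduces the check:

```python
import multiprocessing


def semaphores_available() -> bool:
    """Return True if this environment allows creating a POSIX semaphore,
    which multiprocessing.Lock() (and therefore ProcessPoolExecutor's
    internal queues) depends on."""
    try:
        # Same code path that fails in the log:
        # context.Lock -> synchronize.SemLock -> _multiprocessing.SemLock
        multiprocessing.Lock()
        return True
    except (PermissionError, OSError):
        # e.g. PermissionError: [Errno 13] Permission denied in a
        # sandbox that blocks sem_open / shared memory
        return False


if __name__ == "__main__":
    print("POSIX semaphores usable:", semaphores_available())
```

Running this inside the same chroot should print `False` if the sandbox is the culprit; on an unrestricted host it prints `True`, which would narrow the 32 failures down to the build environment's restrictions.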