==> Building on articuno
==> Checking for remote environment...
==> Syncing package to remote host...
sending incremental file list
./
.SRCINFO
            654 100%    0.00kB/s    0:00:00 (xfr#1, to-chk=3/5)
.nvchecker.toml
            114 100%  111.33kB/s    0:00:00 (xfr#2, to-chk=2/5)
PKGBUILD
            872 100%  851.56kB/s    0:00:00 (xfr#3, to-chk=1/5)
python-digitalocean-1.17.0-7.log
            454 100%  443.36kB/s    0:00:00 (xfr#4, to-chk=0/5)

sent 1,412 bytes  received 113 bytes  1,016.67 bytes/sec
total size is 1,835  speedup is 1.20
==> Running pkgctl build --arch riscv64 --repo extra on remote host...
==> WARNING: unsupported architecture: riscv64
==> Building python-digitalocean
  -> repo: extra
  -> arch: riscv64
  -> worker: felix-2
==> Building python-digitalocean for [extra] (riscv64)
:: Synchronizing package databases...
 core downloading...
 extra downloading...
:: Starting full system upgrade...
 there is nothing to do
==> Building in chroot for [extra] (riscv64)...
==> Locking clean chroot [/var/lib/archbuild/extra-riscv64/root]...done
==> Synchronizing chroot copy [/var/lib/archbuild/extra-riscv64/root] -> [felix-2]...done
==> Making package: python-digitalocean 1.17.0-7 (Sat Jan 11 17:07:30 2025)
==> Retrieving sources...
  -> Found python-digitalocean-1.17.0.tar.gz
==> Validating source files with sha512sums...
    python-digitalocean-1.17.0.tar.gz ... Passed
==> Making package: python-digitalocean 1.17.0-7 (Sat Jan 11 17:07:52 2025)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (7)                      New Version  Net Change  Download Size

extra/mpdecimal                  4.0.0-2        0.29 MiB
core/python                      3.13.1-1     108.57 MiB
extra/python-charset-normalizer  3.4.1-1        0.44 MiB
extra/python-idna                3.10-2         0.88 MiB
extra/python-urllib3             2.3.0-1        1.26 MiB
extra/python-jsonpickle          3.3.0-2        0.43 MiB       0.08 MiB
extra/python-requests            2.32.3-4.1     0.60 MiB

Total Download Size:    0.08 MiB
Total Installed Size: 112.46 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-jsonpickle-3.3.0-2-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing mpdecimal...
installing python...
Optional dependencies for python
    python-setuptools: for building Python packages using tooling that is usually bundled with Python
    python-pip: for installing Python packages using tooling that is usually bundled with Python
    python-pipx: for installing Python software not packaged on Arch Linux
    sqlite: for a default database integration [installed]
    xz: for lzma [installed]
    tk: for tkinter
installing python-charset-normalizer...
installing python-idna...
installing python-urllib3...
Optional dependencies for python-urllib3
    python-brotli: Brotli support
    python-brotlicffi: Brotli support
    python-h2: HTTP/2 support
    python-pysocks: SOCKS support
    python-zstandard: Zstandard support
installing python-requests...
Optional dependencies for python-requests
    python-chardet: alternative character encoding library
    python-pysocks: SOCKS proxy support
installing python-jsonpickle...
Optional dependencies for python-jsonpickle
    python-numpy: for serializing sklearn models, numpy arrays, and other numpy-based data
    python-gmpy2: for serializing ecdsa module's keys
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (16)                     New Version  Net Change  Download Size

extra/libyaml                    0.2.5-3        0.16 MiB
extra/python-autocommand         2.2.2-7        0.08 MiB
extra/python-iniconfig           2.0.0-6        0.04 MiB
extra/python-jaraco.collections  5.1.0-1        0.10 MiB
extra/python-jaraco.context      6.0.1-1        0.04 MiB
extra/python-jaraco.functools    4.1.0-1        0.07 MiB
extra/python-jaraco.text         4.0.0-2        0.08 MiB
extra/python-more-itertools      10.5.0-1       0.64 MiB
extra/python-packaging           24.2-3         0.66 MiB
extra/python-platformdirs        4.3.6-2        0.24 MiB
extra/python-pluggy              1.5.0-3        0.20 MiB
extra/python-wheel               0.45.0-3       0.28 MiB
extra/python-yaml                6.0.2-2        0.91 MiB
extra/python-pytest              1:8.3.4-1      3.92 MiB
extra/python-responses           0.25.3-2       0.77 MiB       0.11 MiB
extra/python-setuptools          1:75.2.0-4     8.05 MiB

Total Download Size:   0.11 MiB
Total Installed Size: 16.25 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-responses-0.25.3-2-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing python-more-itertools...
installing python-jaraco.functools...
installing python-jaraco.context...
installing python-autocommand...
installing python-jaraco.text...
Optional dependencies for python-jaraco.text
    python-inflect: for show-newlines script
installing python-jaraco.collections...
installing python-packaging...
installing python-platformdirs...
installing python-wheel...
Optional dependencies for python-wheel
    python-keyring: for wheel.signatures
    python-xdg: for wheel.signatures
    python-setuptools: for legacy bdist_wheel subcommand [pending]
installing python-setuptools...
installing python-iniconfig...
installing python-pluggy...
installing python-pytest...
installing libyaml...
installing python-yaml...
installing python-responses...
:: Running post-transaction hooks...
(1/1) Arming ConditionNeedsUpdate...
==> Retrieving sources...
  -> Found python-digitalocean-1.17.0.tar.gz
==> WARNING: Skipping all source file integrity checks.
==> Extracting sources...
  -> Extracting python-digitalocean-1.17.0.tar.gz with bsdtar
==> Starting build()...
/usr/lib/python3.13/site-packages/setuptools/_distutils/dist.py:261: UserWarning: Unknown distribution option: 'test_suite'
  warnings.warn(msg)
running build
running build_py
creating build/lib/digitalocean
copying digitalocean/Manager.py -> build/lib/digitalocean
copying digitalocean/baseapi.py -> build/lib/digitalocean
copying digitalocean/SSHKey.py -> build/lib/digitalocean
copying digitalocean/Size.py -> build/lib/digitalocean
copying digitalocean/Record.py -> build/lib/digitalocean
copying digitalocean/Action.py -> build/lib/digitalocean
copying digitalocean/Droplet.py -> build/lib/digitalocean
copying digitalocean/Volume.py -> build/lib/digitalocean
copying digitalocean/Certificate.py -> build/lib/digitalocean
copying digitalocean/Balance.py -> build/lib/digitalocean
copying digitalocean/__init__.py -> build/lib/digitalocean
copying digitalocean/Project.py -> build/lib/digitalocean
copying digitalocean/Metadata.py -> build/lib/digitalocean
copying digitalocean/LoadBalancer.py -> build/lib/digitalocean
copying digitalocean/VPC.py -> build/lib/digitalocean
copying digitalocean/Kernel.py -> build/lib/digitalocean
copying digitalocean/Snapshot.py -> build/lib/digitalocean
copying digitalocean/Domain.py -> build/lib/digitalocean
copying digitalocean/Account.py -> build/lib/digitalocean
copying digitalocean/Image.py -> build/lib/digitalocean
copying digitalocean/Region.py -> build/lib/digitalocean
copying digitalocean/Tag.py -> build/lib/digitalocean
copying digitalocean/Firewall.py -> build/lib/digitalocean
copying digitalocean/FloatingIP.py -> build/lib/digitalocean
==> Starting check()...
============================= test session starts ==============================
platform linux -- Python 3.13.1, pytest-8.3.4, pluggy-1.5.0
rootdir: /build/python-digitalocean/src/python-digitalocean-1.17.0
collected 154 items

digitalocean/tests/test_action.py ..                                     [  1%]
digitalocean/tests/test_baseapi.py ....                                  [  3%]
digitalocean/tests/test_certificate.py ....                              [  6%]
digitalocean/tests/test_domain.py .........                              [ 12%]
digitalocean/tests/test_droplet.py ..................................... [ 36%]
.......                                                                  [ 40%]
digitalocean/tests/test_firewall.py FF.FF                                [ 44%]
digitalocean/tests/test_floatingip.py ......                             [ 48%]
digitalocean/tests/test_image.py .......                                 [ 52%]
digitalocean/tests/test_load_balancer.py ..........                      [ 59%]
digitalocean/tests/test_manager.py ...............................       [ 79%]
digitalocean/tests/test_project.py .........                             [ 85%]
digitalocean/tests/test_snapshot.py ..                                   [ 86%]
digitalocean/tests/test_tag.py .......                                   [ 90%]
digitalocean/tests/test_volume.py ..........                             [ 97%]
digitalocean/tests/test_vpc.py ....                                      [100%]

=================================== FAILURES ===================================
________________________ TestFirewall.test_add_droplets ________________________

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
>               yield

/usr/lib/python3.13/site-packages/urllib3/response.py:754:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , amt = 10240

    def _raw_read(
        self,
        amt: int | None = None,
        *,
        read1: bool = False,
    ) -> bytes:
        """
        Reads `amt` of bytes from the socket.
""" if self._fp is None: return None # type: ignore[return-value] fp_closed = getattr(self._fp, "closed", False) with self._error_catcher(): data = self._fp_read(amt, read1=read1) if not fp_closed else b"" if amt is not None and amt != 0 and not data: # Platform-specific: Buggy versions of Python. # Close the connection when no data is returned # # This is redundant to what httplib/http.client _should_ # already do. However, versions of python released before # December 15, 2012 (http://bugs.python.org/issue16298) do # not properly close the connection in all cases. There is # no harm in redundantly calling close. self._fp.close() if ( self.enforce_content_length and self.length_remaining is not None and self.length_remaining != 0 ): # This is an edge case that httplib failed to cover due # to concerns of backward compatibility. We're # addressing it here to make sure IncompleteRead is # raised during streaming, so all calls with incorrect # Content-Length are caught. > raise IncompleteRead(self._fp_bytes_read, self.length_remaining) E urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected) /usr/lib/python3.13/site-packages/urllib3/response.py:900: IncompleteRead The above exception was the direct cause of the following exception: def generate(): # Special case for urllib3. if hasattr(self.raw, "stream"): try: > yield from self.raw.stream(chunk_size, decode_content=True) /usr/lib/python3.13/site-packages/requests/models.py:820: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/site-packages/urllib3/response.py:1066: in stream data = self.read(amt=amt, decode_content=decode_content) /usr/lib/python3.13/site-packages/urllib3/response.py:983: in read data = self._raw_read(amt) /usr/lib/python3.13/site-packages/urllib3/response.py:878: in _raw_read with self._error_catcher(): /usr/lib/python3.13/contextlib.py:162: in __exit__ self.gen.throw(value) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = @contextmanager def _error_catcher(self) -> typing.Generator[None]: """ Catch low-level python exceptions, instead re-raising urllib3 variants, so that low-level exceptions are not leaked in the high-level api. On exit, release the connection back to the pool. """ clean_exit = False try: try: yield except SocketTimeout as e: # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but # there is yet no clean way to get at it from this context. raise ReadTimeoutError(self._pool, None, "Read timed out.") from e # type: ignore[arg-type] except BaseSSLError as e: # FIXME: Is there a better way to differentiate between SSLErrors? if "read operation timed out" not in str(e): # SSL errors related to framing/MAC get wrapped and reraised here raise SSLError(e) from e raise ReadTimeoutError(self._pool, None, "Read timed out.") from e # type: ignore[arg-type] except IncompleteRead as e: if ( e.expected is not None and e.partial is not None and e.expected == -e.partial ): arg = "Response may not contain content." 
                else:
                    arg = f"Connection broken: {e!r}"
>               raise ProtocolError(arg, e) from e
E               urllib3.exceptions.ProtocolError: ('Response may not contain content.', IncompleteRead(44 bytes read, -44 more expected))

/usr/lib/python3.13/site-packages/urllib3/response.py:778: ProtocolError

During handling of the above exception, another exception occurred:

self =

    @responses.activate
    def test_add_droplets(self):
        data = self.load_from_file('firewalls/droplets.json')

        url = self.base_url + "firewalls/12345/droplets"
        responses.add(responses.POST,
                      url,
                      body=data,
                      status=204,
                      content_type='application/json')

        droplet_id = json.loads(data)["droplet_ids"][0]
>       self.firewall.add_droplets([droplet_id])

digitalocean/tests/test_firewall.py:73:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
digitalocean/Firewall.py:202: in add_droplets
    return self.get_data(
digitalocean/baseapi.py:216: in get_data
    req = self.__perform_request(url, type, params)
digitalocean/baseapi.py:133: in __perform_request
    return requests_method(url, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:746: in send
    r.content
/usr/lib/python3.13/site-packages/requests/models.py:902: in content
    self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
                yield from self.raw.stream(chunk_size, decode_content=True)
            except ProtocolError as e:
>               raise ChunkedEncodingError(e)
E               requests.exceptions.ChunkedEncodingError: ('Response may not contain content.', IncompleteRead(44 bytes read, -44 more expected))

/usr/lib/python3.13/site-packages/requests/models.py:822: ChunkedEncodingError
__________________________ TestFirewall.test_add_tags __________________________

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
>               yield

/usr/lib/python3.13/site-packages/urllib3/response.py:754:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , amt = 10240

    def _raw_read(
        self,
        amt: int | None = None,
        *,
        read1: bool = False,
    ) -> bytes:
        """
        Reads `amt` of bytes from the socket.
        """
        if self._fp is None:
            return None  # type: ignore[return-value]

        fp_closed = getattr(self._fp, "closed", False)

        with self._error_catcher():
            data = self._fp_read(amt, read1=read1) if not fp_closed else b""
            if amt is not None and amt != 0 and not data:
                # Platform-specific: Buggy versions of Python.
                # Close the connection when no data is returned
                #
                # This is redundant to what httplib/http.client _should_
                # already do. However, versions of python released before
                # December 15, 2012 (http://bugs.python.org/issue16298) do
                # not properly close the connection in all cases. There is
                # no harm in redundantly calling close.
                self._fp.close()
                if (
                    self.enforce_content_length
                    and self.length_remaining is not None
                    and self.length_remaining != 0
                ):
                    # This is an edge case that httplib failed to cover due
                    # to concerns of backward compatibility. We're
                    # addressing it here to make sure IncompleteRead is
                    # raised during streaming, so all calls with incorrect
                    # Content-Length are caught.
>                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
E                   urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)

/usr/lib/python3.13/site-packages/urllib3/response.py:900: IncompleteRead

The above exception was the direct cause of the following exception:

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
>               yield from self.raw.stream(chunk_size, decode_content=True)

/usr/lib/python3.13/site-packages/requests/models.py:820:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/site-packages/urllib3/response.py:1066: in stream
    data = self.read(amt=amt, decode_content=decode_content)
/usr/lib/python3.13/site-packages/urllib3/response.py:983: in read
    data = self._raw_read(amt)
/usr/lib/python3.13/site-packages/urllib3/response.py:878: in _raw_read
    with self._error_catcher():
/usr/lib/python3.13/contextlib.py:162: in __exit__
    self.gen.throw(value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
                yield

            except SocketTimeout as e:
                # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
                # there is yet no clean way to get at it from this context.
                raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]

            except BaseSSLError as e:
                # FIXME: Is there a better way to differentiate between SSLErrors?
                if "read operation timed out" not in str(e):
                    # SSL errors related to framing/MAC get wrapped and reraised here
                    raise SSLError(e) from e

                raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]

            except IncompleteRead as e:
                if (
                    e.expected is not None
                    and e.partial is not None
                    and e.expected == -e.partial
                ):
                    arg = "Response may not contain content."
                else:
                    arg = f"Connection broken: {e!r}"
>               raise ProtocolError(arg, e) from e
E               urllib3.exceptions.ProtocolError: ('Response may not contain content.', IncompleteRead(41 bytes read, -41 more expected))

/usr/lib/python3.13/site-packages/urllib3/response.py:778: ProtocolError

During handling of the above exception, another exception occurred:

self =

    @responses.activate
    def test_add_tags(self):
        data = self.load_from_file('firewalls/tags.json')

        url = self.base_url + "firewalls/12345/tags"
        responses.add(responses.POST,
                      url,
                      body=data,
                      status=204,
                      content_type='application/json')

        tag = json.loads(data)["tags"][0]
>       self.firewall.add_tags([tag])

digitalocean/tests/test_firewall.py:104:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
digitalocean/Firewall.py:222: in add_tags
    return self.get_data(
digitalocean/baseapi.py:216: in get_data
    req = self.__perform_request(url, type, params)
digitalocean/baseapi.py:133: in __perform_request
    return requests_method(url, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:115: in post
    return request("post", url, data=data, json=json, **kwargs)
/usr/lib/python3.13/site-packages/requests/api.py:59: in request
    return session.request(method=method, url=url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:746: in send
    r.content
/usr/lib/python3.13/site-packages/requests/models.py:902: in content
    self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
                yield from self.raw.stream(chunk_size, decode_content=True)
            except ProtocolError as e:
>               raise ChunkedEncodingError(e)
E               requests.exceptions.ChunkedEncodingError: ('Response may not contain content.', IncompleteRead(41 bytes read, -41 more expected))

/usr/lib/python3.13/site-packages/requests/models.py:822: ChunkedEncodingError
______________________ TestFirewall.test_remove_droplets _______________________

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
>               yield

/usr/lib/python3.13/site-packages/urllib3/response.py:754:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , amt = 10240

    def _raw_read(
        self,
        amt: int | None = None,
        *,
        read1: bool = False,
    ) -> bytes:
        """
        Reads `amt` of bytes from the socket.
        """
        if self._fp is None:
            return None  # type: ignore[return-value]

        fp_closed = getattr(self._fp, "closed", False)

        with self._error_catcher():
            data = self._fp_read(amt, read1=read1) if not fp_closed else b""
            if amt is not None and amt != 0 and not data:
                # Platform-specific: Buggy versions of Python.
                # Close the connection when no data is returned
                #
                # This is redundant to what httplib/http.client _should_
                # already do. However, versions of python released before
                # December 15, 2012 (http://bugs.python.org/issue16298) do
                # not properly close the connection in all cases. There is
                # no harm in redundantly calling close.
                self._fp.close()
                if (
                    self.enforce_content_length
                    and self.length_remaining is not None
                    and self.length_remaining != 0
                ):
                    # This is an edge case that httplib failed to cover due
                    # to concerns of backward compatibility. We're
                    # addressing it here to make sure IncompleteRead is
                    # raised during streaming, so all calls with incorrect
                    # Content-Length are caught.
>                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
E                   urllib3.exceptions.IncompleteRead: IncompleteRead(44 bytes read, -44 more expected)

/usr/lib/python3.13/site-packages/urllib3/response.py:900: IncompleteRead

The above exception was the direct cause of the following exception:

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
>               yield from self.raw.stream(chunk_size, decode_content=True)

/usr/lib/python3.13/site-packages/requests/models.py:820:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/site-packages/urllib3/response.py:1066: in stream
    data = self.read(amt=amt, decode_content=decode_content)
/usr/lib/python3.13/site-packages/urllib3/response.py:983: in read
    data = self._raw_read(amt)
/usr/lib/python3.13/site-packages/urllib3/response.py:878: in _raw_read
    with self._error_catcher():
/usr/lib/python3.13/contextlib.py:162: in __exit__
    self.gen.throw(value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
                yield

            except SocketTimeout as e:
                # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
                # there is yet no clean way to get at it from this context.
                raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]

            except BaseSSLError as e:
                # FIXME: Is there a better way to differentiate between SSLErrors?
                if "read operation timed out" not in str(e):
                    # SSL errors related to framing/MAC get wrapped and reraised here
                    raise SSLError(e) from e

                raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]

            except IncompleteRead as e:
                if (
                    e.expected is not None
                    and e.partial is not None
                    and e.expected == -e.partial
                ):
                    arg = "Response may not contain content."
                else:
                    arg = f"Connection broken: {e!r}"
>               raise ProtocolError(arg, e) from e
E               urllib3.exceptions.ProtocolError: ('Response may not contain content.', IncompleteRead(44 bytes read, -44 more expected))

/usr/lib/python3.13/site-packages/urllib3/response.py:778: ProtocolError

During handling of the above exception, another exception occurred:

self =

    @responses.activate
    def test_remove_droplets(self):
        data = self.load_from_file('firewalls/droplets.json')

        url = self.base_url + "firewalls/12345/droplets"
        responses.add(responses.DELETE,
                      url,
                      body=data,
                      status=204,
                      content_type='application/json')

        droplet_id = json.loads(data)["droplet_ids"][0]
>       self.firewall.remove_droplets([droplet_id])

digitalocean/tests/test_firewall.py:89:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
digitalocean/Firewall.py:212: in remove_droplets
    return self.get_data(
digitalocean/baseapi.py:216: in get_data
    req = self.__perform_request(url, type, params)
digitalocean/baseapi.py:133: in __perform_request
    return requests_method(url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:671: in delete
    return self.request("DELETE", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:746: in send
    r.content
/usr/lib/python3.13/site-packages/requests/models.py:902: in content
    self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
                yield from self.raw.stream(chunk_size, decode_content=True)
            except ProtocolError as e:
>               raise ChunkedEncodingError(e)
E               requests.exceptions.ChunkedEncodingError: ('Response may not contain content.', IncompleteRead(44 bytes read, -44 more expected))

/usr/lib/python3.13/site-packages/requests/models.py:822: ChunkedEncodingError
________________________ TestFirewall.test_remove_tags _________________________

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
>               yield

/usr/lib/python3.13/site-packages/urllib3/response.py:754:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , amt = 10240

    def _raw_read(
        self,
        amt: int | None = None,
        *,
        read1: bool = False,
    ) -> bytes:
        """
        Reads `amt` of bytes from the socket.
        """
        if self._fp is None:
            return None  # type: ignore[return-value]

        fp_closed = getattr(self._fp, "closed", False)

        with self._error_catcher():
            data = self._fp_read(amt, read1=read1) if not fp_closed else b""
            if amt is not None and amt != 0 and not data:
                # Platform-specific: Buggy versions of Python.
                # Close the connection when no data is returned
                #
                # This is redundant to what httplib/http.client _should_
                # already do. However, versions of python released before
                # December 15, 2012 (http://bugs.python.org/issue16298) do
                # not properly close the connection in all cases. There is
                # no harm in redundantly calling close.
                self._fp.close()
                if (
                    self.enforce_content_length
                    and self.length_remaining is not None
                    and self.length_remaining != 0
                ):
                    # This is an edge case that httplib failed to cover due
                    # to concerns of backward compatibility. We're
                    # addressing it here to make sure IncompleteRead is
                    # raised during streaming, so all calls with incorrect
                    # Content-Length are caught.
>                   raise IncompleteRead(self._fp_bytes_read, self.length_remaining)
E                   urllib3.exceptions.IncompleteRead: IncompleteRead(41 bytes read, -41 more expected)

/usr/lib/python3.13/site-packages/urllib3/response.py:900: IncompleteRead

The above exception was the direct cause of the following exception:

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
>               yield from self.raw.stream(chunk_size, decode_content=True)

/usr/lib/python3.13/site-packages/requests/models.py:820:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/site-packages/urllib3/response.py:1066: in stream
    data = self.read(amt=amt, decode_content=decode_content)
/usr/lib/python3.13/site-packages/urllib3/response.py:983: in read
    data = self._raw_read(amt)
/usr/lib/python3.13/site-packages/urllib3/response.py:878: in _raw_read
    with self._error_catcher():
/usr/lib/python3.13/contextlib.py:162: in __exit__
    self.gen.throw(value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =

    @contextmanager
    def _error_catcher(self) -> typing.Generator[None]:
        """
        Catch low-level python exceptions, instead re-raising urllib3
        variants, so that low-level exceptions are not leaked in the
        high-level api.

        On exit, release the connection back to the pool.
        """
        clean_exit = False

        try:
            try:
                yield

            except SocketTimeout as e:
                # FIXME: Ideally we'd like to include the url in the ReadTimeoutError but
                # there is yet no clean way to get at it from this context.
                raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]

            except BaseSSLError as e:
                # FIXME: Is there a better way to differentiate between SSLErrors?
                if "read operation timed out" not in str(e):
                    # SSL errors related to framing/MAC get wrapped and reraised here
                    raise SSLError(e) from e

                raise ReadTimeoutError(self._pool, None, "Read timed out.") from e  # type: ignore[arg-type]

            except IncompleteRead as e:
                if (
                    e.expected is not None
                    and e.partial is not None
                    and e.expected == -e.partial
                ):
                    arg = "Response may not contain content."
                else:
                    arg = f"Connection broken: {e!r}"
>               raise ProtocolError(arg, e) from e
E               urllib3.exceptions.ProtocolError: ('Response may not contain content.', IncompleteRead(41 bytes read, -41 more expected))

/usr/lib/python3.13/site-packages/urllib3/response.py:778: ProtocolError

During handling of the above exception, another exception occurred:

self =

    @responses.activate
    def test_remove_tags(self):
        data = self.load_from_file('firewalls/tags.json')

        url = self.base_url + "firewalls/12345/tags"
        responses.add(responses.DELETE,
                      url,
                      body=data,
                      status=204,
                      content_type='application/json')

        tag = json.loads(data)["tags"][0]
>       self.firewall.remove_tags([tag])

digitalocean/tests/test_firewall.py:119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
digitalocean/Firewall.py:232: in remove_tags
    return self.get_data(
digitalocean/baseapi.py:216: in get_data
    req = self.__perform_request(url, type, params)
digitalocean/baseapi.py:133: in __perform_request
    return requests_method(url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:671: in delete
    return self.request("DELETE", url, **kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.13/site-packages/requests/sessions.py:746: in send
    r.content
/usr/lib/python3.13/site-packages/requests/models.py:902: in content
    self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def generate():
        # Special case for urllib3.
        if hasattr(self.raw, "stream"):
            try:
                yield from self.raw.stream(chunk_size, decode_content=True)
            except ProtocolError as e:
>               raise ChunkedEncodingError(e)
E               requests.exceptions.ChunkedEncodingError: ('Response may not contain content.', IncompleteRead(41 bytes read, -41 more expected))

/usr/lib/python3.13/site-packages/requests/models.py:822: ChunkedEncodingError
=============================== warnings summary ===============================
digitalocean/tests/test_droplet.py::TestDroplet::test_get_kernel_available_with_pages
digitalocean/tests/test_manager.py::TestManager::test_get_droplet_snapshots
digitalocean/tests/test_manager.py::TestManager::test_get_per_region_volumes
digitalocean/tests/test_manager.py::TestManager::test_get_volume_snapshots
  /usr/lib/python3.13/site-packages/responses/__init__.py:436: DeprecationWarning: Argument 'match_querystring' is deprecated. Use 'responses.matchers.query_param_matcher' or 'responses.matchers.query_string_matcher'
    warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED digitalocean/tests/test_firewall.py::TestFirewall::test_add_droplets
FAILED digitalocean/tests/test_firewall.py::TestFirewall::test_add_tags - req...
FAILED digitalocean/tests/test_firewall.py::TestFirewall::test_remove_droplets
FAILED digitalocean/tests/test_firewall.py::TestFirewall::test_remove_tags - ...
================== 4 failed, 150 passed, 4 warnings in 5.00s ===================
==> ERROR: A failure occurred in check().
    Aborting...
==> ERROR: Build failed, check /var/lib/archbuild/extra-riscv64/felix-2/build
receiving incremental file list
python-digitalocean-1.17.0-7-riscv64-build.log
python-digitalocean-1.17.0-7-riscv64-check.log

sent 62 bytes  received 3,902 bytes  7,928.00 bytes/sec
total size is 34,230  speedup is 8.64
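All four check() failures share one shape: the test suite mocks the firewall endpoints with the responses library and registers an HTTP 204 (No Content) reply that still carries a JSON body, while the urllib3 2.3.0 installed into the chroot enforces Content-Length. Reading 41 or 44 body bytes from a response that is supposed to be empty raises IncompleteRead, which requests surfaces as ChunkedEncodingError. The stand-alone sketch below tries to show that interaction in isolation; it is not part of the package, the URL and payload are invented, and only the mocking pattern is taken from the test source quoted in the traceback.

import json

import requests
import responses


@responses.activate
def reproduce():
    # Hypothetical endpoint; the real tests build the URL from self.base_url.
    url = "https://api.example.test/v2/firewalls/12345/tags"
    # Register a 204 No Content reply that nevertheless has a body,
    # mirroring the pattern used by TestFirewall in the log above.
    responses.add(responses.POST, url,
                  body=json.dumps({"tags": ["frontend"]}),
                  status=204,
                  content_type='application/json')
    try:
        requests.post(url)
    except requests.exceptions.ChunkedEncodingError as exc:
        # urllib3 2.x treats a 204 as zero-length, so the streamed body trips
        # IncompleteRead; requests re-raises it as ChunkedEncodingError,
        # the same error shown in check() above.
        print("reproduced:", exc)


reproduce()

If that reading is correct, plausible workarounds would be to patch the tests to mock these endpoints with status=200, to register the 204 without a body, or to deselect the four firewall tests in check() until upstream updates its fixtures; none of these is applied in the build shown here.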