==> Building on litleo
==> Checking for remote environment...
==> Syncing package to remote host...
sending incremental file list
./
PKGBUILD
          1,534 100%    0.00kB/s    0:00:00 (xfr#1, to-chk=1/3)
dns-lexicon-3.12.0-1.log
            316 100%  308.59kB/s    0:00:00 (xfr#2, to-chk=0/3)

sent 1,081 bytes  received 75 bytes  770.67 bytes/sec
total size is 1,697  speedup is 1.47
==> Running extra-riscv64-build -- -d /home/felix/packages/riscv64-pkg-cache:/var/cache/pacman/pkg -l root8 on remote host...
:: Synchronizing package databases...
 core downloading...
 extra downloading...
:: Starting full system upgrade...
resolving dependencies...
looking for conflicting packages...

Package (4)      Old Version  New Version  Net Change  Download Size

core/bison       3.8.2-5      3.8.2-6        0.01 MiB       0.75 MiB
core/libp11-kit  0.24.1-1     0.25.0-1       0.40 MiB       0.52 MiB
core/p11-kit     0.24.1-1     0.25.0-1       0.23 MiB       0.21 MiB
core/zlib        1:1.2.13-2   1:1.2.13-3     0.06 MiB       0.15 MiB

Total Download Size:   1.63 MiB
Total Installed Size:  6.99 MiB
Net Upgrade Size:      0.69 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 bison-3.8.2-6-riscv64 downloading...
 libp11-kit-0.25.0-1-riscv64 downloading...
 p11-kit-0.25.0-1-riscv64 downloading...
 zlib-1:1.2.13-3-riscv64 downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
upgrading zlib...
upgrading bison...
upgrading libp11-kit...
upgrading p11-kit...
:: Running post-transaction hooks...
(1/1) Updating the info directory file...
==> Building in chroot for [extra] (riscv64)...
==> Synchronizing chroot copy [/var/lib/archbuild/extra-riscv64/root] -> [root8]...done
==> Making package: dns-lexicon 3.12.0-1 (Sun Jul 2 03:04:51 2023)
==> Retrieving sources...
  -> Downloading dns-lexicon-3.12.0.tar.gz...
     [curl progress meter trimmed: 6572k fetched in about 2 seconds]
==> Validating source files with sha512sums...
    dns-lexicon-3.12.0.tar.gz ... Passed
==> Making package: dns-lexicon 3.12.0-1 (Sun Jul 2 03:05:10 2023)
==> Checking runtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (20)                     New Version  Net Change  Download Size

core/libnsl                      2.0.0-3        0.06 MiB
extra/libyaml                    0.2.5-2        0.15 MiB
core/python                      3.11.3-1     104.68 MiB
extra/python-cffi                1.15.1-3       1.39 MiB
extra/python-chardet             5.1.0-3        3.02 MiB
extra/python-filelock            3.12.0-2       0.07 MiB
extra/python-idna                3.4-3          0.71 MiB
extra/python-ply                 3.11-12        0.40 MiB
extra/python-pycparser           2.21-5         1.77 MiB
extra/python-requests-file       1.5.1-6        0.02 MiB       0.01 MiB
extra/python-six                 1.16.0-8       0.12 MiB
extra/python-soupsieve           2.4.1-1        0.46 MiB       0.08 MiB
extra/python-urllib3             1.26.15-1      1.30 MiB
extra/python-zipp                3.15.0-2       0.06 MiB
extra/python-beautifulsoup4      4.12.2-1       1.68 MiB       0.27 MiB
extra/python-cryptography        41.0.1-2      54.95 MiB
extra/python-importlib-metadata  5.0.0-5        0.20 MiB
extra/python-requests            2.28.2-4       0.61 MiB
extra/python-tldextract          3.4.4-1        0.35 MiB       0.10 MiB
extra/python-yaml                6.0-3          0.93 MiB

Total Download Size:     0.46 MiB
Total Installed Size:  172.93 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-beautifulsoup4-4.12.2-1-any downloading...
 python-tldextract-3.4.4-1-any downloading...
 python-soupsieve-2.4.1-1-any downloading...
 python-requests-file-1.5.1-6-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing libnsl...
installing python...
Optional dependencies for python
    python-setuptools
    python-pip
    sqlite [installed]
    mpdecimal: for decimal
    xz: for lzma [installed]
    tk: for tkinter
installing python-soupsieve...
installing python-beautifulsoup4...
Optional dependencies for python-beautifulsoup4
    python-chardet: to autodetect character encodings [pending]
    python-lxml: alternative HTML parser
    python-html5lib: alternative HTML parser
installing python-ply...
installing python-pycparser...
installing python-cffi...
installing python-cryptography...
installing libyaml...
installing python-yaml...
installing python-urllib3...
Optional dependencies for python-urllib3
    python-brotli: Brotli support
    python-certifi: security support
    python-cryptography: security support [installed]
    python-idna: security support [pending]
    python-pyopenssl: security support
    python-pysocks: SOCKS support
installing python-chardet...
installing python-idna...
installing python-requests...
Optional dependencies for python-requests
    python-pysocks: SOCKS proxy support
installing python-six...
installing python-requests-file...
installing python-filelock...
installing python-tldextract...
installing python-zipp...
installing python-importlib-metadata...
==> Checking buildtime dependencies...
==> Installing missing dependencies...
resolving dependencies...
looking for conflicting packages...

Package (50)                    New Version     Net Change  Download Size

core/dnssec-anchors             20190629-3        0.00 MiB
extra/jemalloc                  1:5.3.0-2         4.84 MiB
core/libedit                    20221030_3.1-1    0.24 MiB
extra/libmaxminddb              1.7.1-1           0.04 MiB
extra/libuv                     1.46.0-1          0.55 MiB       0.24 MiB
extra/libxslt                   1.1.37-3          0.70 MiB
extra/lmdb                      0.9.29-1          0.39 MiB
extra/python-attrs              22.2.0-3          0.55 MiB
extra/python-botocore           1.29.162-1       84.76 MiB       5.75 MiB
extra/python-certifi            2023.05.07-1      0.02 MiB
extra/python-click              8.1.3-3           1.22 MiB
extra/python-colorama           0.4.6-2           0.27 MiB
extra/python-dateutil           2.8.2-5           1.05 MiB
extra/python-iniconfig          2.0.0-4           0.04 MiB
extra/python-isodate            0.6.1-3           0.35 MiB       0.06 MiB
extra/python-jmespath           1.0.1-2           0.23 MiB       0.04 MiB
extra/python-jsonschema         4.17.3-3          1.33 MiB
extra/python-lark-parser        1.1.5-4           1.29 MiB
extra/python-lxml               4.9.2-3           4.39 MiB
extra/python-markdown-it-py     2.2.0-3           0.67 MiB
extra/python-mdurl              0.1.2-4           0.06 MiB
extra/python-multidict          6.0.4-2           0.16 MiB
extra/python-packaging          23.1-1            0.47 MiB
extra/python-platformdirs       3.5.1-1           0.20 MiB
extra/python-pluggy             1.0.0-4           0.13 MiB
extra/python-prettytable        3.6.0-3           0.35 MiB       0.06 MiB
extra/python-prompt_toolkit     3.0.38-2          4.65 MiB       0.71 MiB
extra/python-pygments           2.15.1-1         13.57 MiB
extra/python-pyproject-hooks    1.0.0-5           0.09 MiB
extra/python-pyrsistent         0.19.3-3          0.63 MiB
extra/python-pytz               2023.3-1          0.17 MiB
extra/python-requests-toolbelt  1.0.0-1           0.46 MiB       0.08 MiB
extra/python-rich               13.4.2-1          3.28 MiB       0.52 MiB
extra/python-s3transfer         0.6.1-1           0.93 MiB       0.13 MiB
extra/python-typing_extensions  4.7.0-1           0.37 MiB
extra/python-uc-micro-py        1.0.2-1           0.02 MiB
extra/python-wcwidth            0.2.6-1           0.52 MiB
extra/python-wrapt              1.14.1-2          0.24 MiB       0.05 MiB
extra/python-yarl               1.9.2-1           0.27 MiB       0.08 MiB
extra/bind                      9.18.16-1        11.11 MiB       2.09 MiB
extra/python-boto3              1.26.162-1        1.55 MiB       0.15 MiB
extra/python-build              0.10.0-4          0.68 MiB
extra/python-dnspython          1:2.3.0-3         3.15 MiB
extra/python-installer          0.7.0-3           0.82 MiB
extra/python-localzone          0.9.8-4           0.07 MiB       0.02 MiB
extra/python-poetry-core        1.6.1-1           5.17 MiB
extra/python-pytest             7.4.0-1           4.01 MiB       0.68 MiB
extra/python-softlayer          6.1.4-2           5.36 MiB       0.76 MiB
extra/python-vcrpy              4.2.1-3           0.45 MiB       0.09 MiB
extra/python-zeep               4.2.1-2           1.29 MiB       0.21 MiB

Total Download Size:    11.70 MiB
Total Installed Size:  163.13 MiB

:: Proceed with installation? [Y/n]
:: Retrieving packages...
 python-botocore-1.29.162-1-any downloading...
 bind-9.18.16-1-riscv64 downloading...
 python-softlayer-6.1.4-2-any downloading...
 python-prompt_toolkit-3.0.38-2-any downloading...
 python-pytest-7.4.0-1-any downloading...
 python-rich-13.4.2-1-any downloading...
 libuv-1.46.0-1-riscv64 downloading...
 python-zeep-4.2.1-2-any downloading...
 python-boto3-1.26.162-1-any downloading...
 python-s3transfer-0.6.1-1-any downloading...
 python-vcrpy-4.2.1-3-any downloading...
 python-requests-toolbelt-1.0.0-1-any downloading...
 python-yarl-1.9.2-1-riscv64 downloading...
 python-prettytable-3.6.0-3-any downloading...
 python-isodate-0.6.1-3-any downloading...
 python-wrapt-1.14.1-2-riscv64 downloading...
 python-jmespath-1.0.1-2-any downloading...
 python-localzone-0.9.8-4-any downloading...
checking keyring...
checking package integrity...
loading package files...
checking for file conflicts...
:: Processing package changes...
installing python-packaging...
installing python-pyproject-hooks...
installing python-build...
Optional dependencies for python-build
    python-virtualenv: Use virtualenv for build isolation
installing python-installer...
installing python-attrs...
installing python-pyrsistent...
installing python-typing_extensions...
installing python-jsonschema...
Optional dependencies for python-jsonschema
    python-isoduration: for duration format
    python-fqdn: for hostname format
    python-idna: for idn-hostname format [installed]
    python-jsonpointer: for json-pointer & relative-json-pointer format
    python-rfc3339-validator: for date-time format
    python-rfc3987: for iri, iri-reference, uri & uri-reference format
    python-uri-template: for uri-template format
    python-webcolors: for color format
installing python-lark-parser...
Optional dependencies for python-lark-parser
    python-atomicwrites: for atomic_cache
    python-regex: for regex support
    python-js2py: for nearley support
installing python-poetry-core...
installing python-iniconfig...
installing python-pluggy...
installing python-pytest...
installing python-wrapt...
installing python-multidict...
installing python-yarl...
installing python-vcrpy...
installing python-certifi...
installing python-dateutil...
installing python-jmespath...
installing python-botocore...
Optional dependencies for python-botocore
    python-awscrt
installing python-s3transfer...
Optional dependencies for python-s3transfer
    python-awscrt
installing python-boto3...
installing python-dnspython...
Optional dependencies for python-dnspython
    python-cryptography: DNSSEC support [installed]
    python-requests-toolbelt: DoH support [pending]
    python-idna: support for updated IDNA 2008 [installed]
    python-curio: async support
    python-trio: async support
    python-sniffio: async support
installing python-localzone...
installing python-wcwidth...
installing python-prettytable...
installing python-click...
installing python-pygments...
installing python-prompt_toolkit...
installing python-colorama...
installing python-uc-micro-py...
installing python-mdurl...
installing python-markdown-it-py...
Optional dependencies for python-markdown-it-py
    python-mdit_py_plugins: core plugins
installing python-rich...
installing python-softlayer...
installing python-isodate...
installing libxslt...
Optional dependencies for libxslt
    python: Python bindings [installed]
installing python-lxml...
Optional dependencies for python-lxml
    python-beautifulsoup4: support for beautifulsoup parser to parse not well formed HTML [installed]
    python-cssselect: support for cssselect
    python-html5lib: support for html5lib parser
    python-lxml-docs: offline docs
installing python-platformdirs...
installing python-requests-toolbelt...
installing python-pytz...
installing python-zeep...
installing dnssec-anchors...
installing libedit...
installing libmaxminddb...
Optional dependencies for libmaxminddb
    geoip2-database: IP geolocation databases
installing libuv...
installing lmdb...
installing jemalloc...
Optional dependencies for jemalloc
    perl: for jeprof [installed]
installing bind...
==> Retrieving sources...
  -> Found dns-lexicon-3.12.0.tar.gz
==> WARNING: Skipping all source file integrity checks.
==> Extracting sources...
  -> Extracting dns-lexicon-3.12.0.tar.gz with bsdtar
==> Starting build()...
* Getting build dependencies for wheel...
* Building wheel...
Successfully built dns_lexicon-3.12.0-py3-none-any.whl
==> Starting check()...
============================= test session starts ==============================
platform linux -- Python 3.11.3, pytest-7.4.0, pluggy-1.0.0
rootdir: /build/dns-lexicon/src/lexicon-3.12.0
configfile: pyproject.toml
collected 2445 items / 58 deselected / 2387 selected

lexicon/tests/test_client.py ............                                [  0%]
lexicon/tests/test_config.py .........                                   [  0%]
lexicon/tests/test_library.py .......                                    [  1%]
lexicon/tests/test_output.py .....                                       [  1%]
lexicon/tests/test_parser.py .....                                       [  1%]
lexicon/tests/providers/test_aliyun.py ..................ss.........     [  2%]
lexicon/tests/providers/test_aurora.py ..................ss.........     [  4%]
lexicon/tests/providers/test_auto.py sF..FFFFFFFFFFFFFFFssFFFFFFFFF      [  5%]
lexicon/tests/providers/test_azure.py ..................ss.........      [  6%]
lexicon/tests/providers/test_cloudflare.py ..................ss......... [  7%]
lexicon/tests/providers/test_cloudns.py ..................ss.........    [  8%]
lexicon/tests/providers/test_cloudxns.py ..................ss......s..   [ 10%]
lexicon/tests/providers/test_conoha.py ..................ss.........     [ 11%]
lexicon/tests/providers/test_constellix.py ..................ss......... [ 12%]
lexicon/tests/providers/test_ddns.py sssssssssssssssssssssssssssss       [ 13%]
lexicon/tests/providers/test_digitalocean.py ..................ss......s.. [ 14%]
lexicon/tests/providers/test_dinahosting.py ................s.ss......... [ 16%]
lexicon/tests/providers/test_directadmin.py ..................ss......... [ 17%]
lexicon/tests/providers/test_dnsimple.py ..................ss.........   [ 18%]
lexicon/tests/providers/test_dnsmadeeasy.py ..................ss......s.. [ 19%]
lexicon/tests/providers/test_dnspark.py ............sss.....s..          [ 20%]
lexicon/tests/providers/test_dnspod.py ............sss.....s..           [ 21%]
lexicon/tests/providers/test_dnsservices.py ..................ss......... [ 22%]
lexicon/tests/providers/test_dreamhost.py ...................s.ss......... [ 24%]
lexicon/tests/providers/test_duckdns.py .ss..s...s.....ssssss.....s.ss   [ 25%]
lexicon/tests/providers/test_dynu.py ..................ss.........       [ 26%]
lexicon/tests/providers/test_easydns.py .............ss.....s..          [ 27%]
lexicon/tests/providers/test_easyname.py ..................ss......... [ 28%]
lexicon/tests/providers/test_euserv.py ..................ss.........     [ 30%]
lexicon/tests/providers/test_exoscale.py ..................ss.........   [ 31%]
lexicon/tests/providers/test_flexibleengine.py ..............s..sssss.s..... [ 32%]
lexicon/tests/providers/test_gandi.py ..................ss.................ss......... [ 35%]
lexicon/tests/providers/test_gehirn.py .............ss........           [ 36%]
lexicon/tests/providers/test_glesys.py .............ss.....s..           [ 36%]
lexicon/tests/providers/test_godaddy.py ..................ss.........    [ 38%]
lexicon/tests/providers/test_googleclouddns.py ..................ss......... [ 39%]
lexicon/tests/providers/test_gransy.py ..................ss.........     [ 40%]
lexicon/tests/providers/test_gratisdns.py ..................ss.........  [ 41%]
lexicon/tests/providers/test_henet.py ..................ss.........      [ 43%]
lexicon/tests/providers/test_hetzner.py ..................ss.........    [ 44%]
lexicon/tests/providers/test_hostingde.py ..................ss.........  [ 45%]
lexicon/tests/providers/test_hover.py ..................ss.........      [ 46%]
lexicon/tests/providers/test_infoblox.py ..................ss.........   [ 47%]
lexicon/tests/providers/test_infomaniak.py ..................ss......... [ 49%]
lexicon/tests/providers/test_internetbs.py ..................ss......... [ 50%]
lexicon/tests/providers/test_inwx.py ..................ss.........       [ 51%]
lexicon/tests/providers/test_joker.py ..................ss.........      [ 52%]
lexicon/tests/providers/test_linode.py ................s.ss.........     [ 54%]
lexicon/tests/providers/test_linode4.py ................s.ss.........    [ 55%]
lexicon/tests/providers/test_localzone.py ss................ss.........  [ 56%]
lexicon/tests/providers/test_luadns.py ....s.............ss......s..     [ 57%]
lexicon/tests/providers/test_memset.py ..................ss......s..     [ 58%]
lexicon/tests/providers/test_misaka.py ..................ss.........     [ 60%]
lexicon/tests/providers/test_mythicbeasts.py ..................ss......... [ 61%]
lexicon/tests/providers/test_namecheap.py ...FFFFFFFFFFFFFsFssFFFFFFFFF....FFFFFFFFFFFFFsFssFFFFFFFFF [ 63%]
lexicon/tests/providers/test_namecom.py ........................ss............... [ 65%]
lexicon/tests/providers/test_namesilo.py ..................ss......s..   [ 66%]
lexicon/tests/providers/test_netcup.py ................s.ss.........     [ 67%]
lexicon/tests/providers/test_nfsn.py ..................ss.........       [ 69%]
lexicon/tests/providers/test_njalla.py .........s.....s..ss.........     [ 70%]
lexicon/tests/providers/test_nsone.py ................s.ss......s..      [ 71%]
lexicon/tests/providers/test_onapp.py ..................ss.........      [ 72%]
lexicon/tests/providers/test_online.py .............s....ss.........     [ 73%]
lexicon/tests/providers/test_ovh.py ..................ss.........        [ 75%]
lexicon/tests/providers/test_plesk.py ................s.ss.........      [ 76%]
lexicon/tests/providers/test_pointhq.py ................s.ss......s..    [ 77%]
lexicon/tests/providers/test_porkbun.py ..................ss.........    [ 78%]
lexicon/tests/providers/test_powerdns.py ..................ss......s..   [ 80%]
lexicon/tests/providers/test_rackspace.py ..................ss.........  [ 81%]
lexicon/tests/providers/test_rage4.py ..................ss.....ss.s      [ 82%]
lexicon/tests/providers/test_rcodezero.py ..................ss.........  [ 83%]
lexicon/tests/providers/test_route53.py ....................ss......... [ 84%]
lexicon/tests/providers/test_safedns.py ....s...........s.ss.........    [ 86%]
lexicon/tests/providers/test_sakuracloud.py ...........s.ss........      [ 87%]
lexicon/tests/providers/test_softlayer.py .............ss........        [ 88%]
lexicon/tests/providers/test_transip.py ..................ss.........    [ 89%]
lexicon/tests/providers/test_ultradns.py ..................ss.........   [ 90%]
lexicon/tests/providers/test_valuedomain.py ..................ss......... [ 91%]
lexicon/tests/providers/test_vercel.py ..................ss.........     [ 92%]
lexicon/tests/providers/test_vultr.py ..................ss.........      [ 94%]
lexicon/tests/providers/test_webgo.py ..................ss.........      [ 95%]
lexicon/tests/providers/test_yandex.py ..................ss.........     [ 96%]
lexicon/tests/providers/test_yandexcloud.py ..................ss......... [ 97%]
lexicon/tests/providers/test_zilore.py ..................ss.........     [ 99%]
lexicon/tests/providers/test_zonomi.py .............ss.....s..           [100%]

=================================== FAILURES ===================================
_________________ AutoProviderTests.test_provider_authenticate _________________

self = <...>
func = <...>
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': <...>, 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

    def run_and_cache(
        self,
        func: Callable[..., T],
        namespace: str,
        kwargs: Dict[str, Hashable],
        hashed_argnames: Iterable[str],
    ) -> T:
        """Get a url but cache the response."""
        if not self.enabled:
            return func(**kwargs)

        key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames}
        cache_filepath = self._key_to_cachefile_path(namespace, key_args)
        lock_path = cache_filepath + ".lock"
        try:
            _make_dir(cache_filepath)
        except OSError as ioe:
            global _DID_LOG_UNABLE_TO_CACHE  # pylint: disable=global-statement
            if not _DID_LOG_UNABLE_TO_CACHE:
                LOG.warning(
                    "unable to cache %s.%s in %s. This could refresh the "
                    "Public Suffix List over HTTP every app startup. "
                    "Construct your `TLDExtract` with a writable `cache_dir` or "
                    "set `cache_dir=None` to silence this warning. %s",
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_authenticate(self): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:121: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
    resp = conn.urlopen(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
    httplib_response = self._make_request(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
    httplib_response = conn.getresponse(buffering=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>
_ = False, kwargs = {'buffering': True}

    def getresponse(self, _=False, **kwargs):
        """Retrieve the response"""
        # Check to see if the cassette has a response for this request. If so,
        # then return it
        if self.cassette.can_play_response_for(self._vcr_request):
            log.info("Playing response for {} from cassette".format(self._vcr_request))
            response = self.cassette.play_response(self._vcr_request)
            return VCRHTTPResponse(response)
        else:
            if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request):
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_authenticate.yaml') in your current record mode ('none').
E               No match for the request (<...>) was found.
E               Found 3 similar requests with 2 different matcher(s) :
E
E               1 - (<...>).
E               Matchers succeeded : ['method', 'scheme', 'port', 'query']
E               Matchers failed :
E               host - assertion failure :
E               publicsuffix.org != eu.api.ovh.com
E               path - assertion failure :
E               /list/public_suffix_list.dat != /1.0/auth/time
E
E               2 - (<...>).
E               Matchers succeeded : ['method', 'scheme', 'port', 'query']
E               Matchers failed :
E               host - assertion failure :
E               publicsuffix.org != eu.api.ovh.com
E               path - assertion failure :
E               /list/public_suffix_list.dat != /1.0/domain/zone/
E
E               3 - (<...>).
E               Matchers succeeded : ['method', 'scheme', 'port', 'query']
E               Matchers failed :
E               host - assertion failure :
E               publicsuffix.org != eu.api.ovh.com
E               path - assertion failure :
E               /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
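What the traceback shows: tldextract finds no Public Suffix List in its on-disk cache (the two KeyErrors), so it falls back to fetching https://publicsuffix.org/list/public_suffix_list.dat over HTTP, and that unexpected request is then rejected by vcrpy because the test cassette only holds recorded OVH API interactions. A minimal sketch of how tldextract can be kept offline in a build environment like this, assuming only its documented constructor arguments (the `no_fetch_extract` name is illustrative, not part of the package):

# Sketch: construct a TLDExtract that never touches the network, relying on
# the Public Suffix List snapshot bundled with the tldextract package.
import tldextract

no_fetch_extract = tldextract.TLDExtract(
    suffix_list_urls=(),  # empty tuple: never fetch the PSL over HTTP
    cache_dir=None,       # no disk cache either (cf. the LOG.warning above)
)

# Uses only the bundled snapshot, so no request ever reaches vcrpy's stubs.
print(no_fetch_extract("ns1.example.co.uk"))

With both the fetch URLs and the cache disabled, the fallback-to-snapshot path seen in the kwargs above is taken unconditionally, which sidesteps the entire failure chain.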
This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_create_record_for_A_with_valid_name_and_content( self, ): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:141: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, 
timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_for_A_with_valid_name_and_content.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 3 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). 
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) 
except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content( self, ): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in 
find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 3 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). 
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) 
except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content( self, ): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:171: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in 
find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 3 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). 
_ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

self = 
func = 
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

[... run_and_cache() source listing identical to the first failure above ...]

/usr/lib/python3.11/site-packages/tldextract/cache.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
namespace = 'publicsuffix.org-tlds'
key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}

[... DiskCache.get() source listing identical to the first failure above ...]

E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self = 
func = 
namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']

[... second run_and_cache()/DiskCache.get() round, identical to the first failure ...]

E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_full_name_and_content(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[... call chain identical to the first failure: _construct_authenticated_provider() -> provider.authenticate() -> lexicon/providers/auto.py -> tldextract -> requests -> urllib3 -> vcr getresponse() ...]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_full_name_and_content.yaml') in your current record mode ('none').
E   No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E   Found 3 similar requests with 2 different matcher(s) :
E
E   [... similar requests 1-3 identical to the first failure: /1.0/auth/time, /1.0/domain/zone/, /1.0/domain/zone/pacalis.net/status ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
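[Editor's note] The CannotOverwriteExistingCassetteException itself is vcrpy working as designed: these integration tests replay cassettes in record mode 'none', which makes them read-only, and the PSL download matches none of the recorded OVH interactions. A small sketch of that behaviour with a hypothetical cassette path; the matcher list mirrors the "Matchers succeeded/failed" output above:

import requests
import vcr

my_vcr = vcr.VCR(
    record_mode="none",  # replay only: never record new interactions
    match_on=["method", "scheme", "host", "port", "path", "query"],
)

# With record_mode="none", a request that matches no stored interaction
# raises CannotOverwriteExistingCassetteException instead of reaching the
# network, which is exactly what the PSL fetch triggers here.
with my_vcr.use_cassette("tests/fixtures/cassettes/example.yaml"):
    requests.get("https://eu.api.ovh.com/1.0/auth/time")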
_ AutoProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

[... locals and the two chained run_and_cache()/DiskCache.get() failures are identical to the tests above ...]

E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:155:

[... call chain identical to the first failure ...]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content.yaml') in your current record mode ('none').
E   No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E   Found 3 similar requests with 2 different matcher(s) :
E
E   [... similar requests 1-3 identical to the first failure ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
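[Editor's note] The doubled KeyError in each traceback is tldextract's cache-miss protocol: DiskCache.get() raises KeyError on a miss, and run_and_cache() treats that as the signal to compute the value and store it. A stripped-down sketch of the same look-aside pattern; tldextract's real version adds file locking and hashed cache-file paths:

from typing import Callable, Dict, Hashable

class TinyCache:
    def __init__(self) -> None:
        self._store: Dict[Hashable, object] = {}

    def get(self, key: Hashable) -> object:
        if key not in self._store:
            raise KeyError(key)  # a miss is an exception, not a sentinel value
        return self._store[key]

    def run_and_cache(self, func: Callable[[], object], key: Hashable) -> object:
        try:
            return self.get(key)       # hit: replay the stored result
        except KeyError:
            result = func()            # miss: compute ...
            self._store[key] = result  # ... and remember it for next time
            return result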
_ AutoProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

[... locals and the two chained run_and_cache()/DiskCache.get() failures are identical to the tests above ...]

E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_create_record_multiple_times_should_create_record_set(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:504:

[... call chain identical to the first failure ...]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_multiple_times_should_create_record_set.yaml') in your current record mode ('none').
E   No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E   Found 4 similar requests with 2 different matcher(s) :
E
E   [... similar requests 1-3 identical to the first failure ...]
E
E   4 - (<Request (GET) https://eu.api.ovh.com/1.0/domain/zone/pacalis.net/record/1570610750>).
E   Matchers succeeded : ['method', 'scheme', 'port', 'query']
E   Matchers failed :
E   host - assertion failure :
E   publicsuffix.org != eu.api.ovh.com
E   path - assertion failure :
E   /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610750

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
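[Editor's note] The FileLock seen in run_and_cache() serializes the check-compute-store sequence across processes so concurrent builds do not race on the same cache file. A minimal sketch of the py-filelock usage, with a hypothetical path; the ".lock" suffix mirrors what the listings above show:

import os
from filelock import FileLock, Timeout

cache_filepath = "/tmp/demo-cache/entry.json"  # hypothetical cache file
os.makedirs(os.path.dirname(cache_filepath), exist_ok=True)

lock = FileLock(cache_filepath + ".lock", timeout=20)
try:
    with lock:  # blocks other processes until released
        with open(cache_filepath, "a", encoding="utf-8") as cache_file:
            cache_file.write("{}\n")
except Timeout:
    print("gave up waiting for the cache lock after 20 seconds")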
_ AutoProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

[... locals and the two chained run_and_cache()/DiskCache.get() failures are identical to the tests above ...]

E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_create_record_with_duplicate_records_should_be_noop(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:490:

[... call chain identical to the first failure ...]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_create_record_with_duplicate_records_should_be_noop.yaml') in your current record mode ('none').
E   No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E   Found 5 similar requests with 2 different matcher(s) :
E
E   [... similar requests 1-3 identical to the first failure ...]
E
E   4 - (<Request (GET) https://eu.api.ovh.com/1.0/domain/zone/pacalis.net/record/1570610752>).
E   Matchers succeeded : ['method', 'scheme', 'port', 'query']
E   Matchers failed :
E   host - assertion failure :
E   publicsuffix.org != eu.api.ovh.com
E   path - assertion failure :
E   /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610752
E
E   5 - (<Request (GET) https://eu.api.ovh.com/1.0/domain/zone/pacalis.net/record/1570610752>).
E   Matchers succeeded : ['method', 'scheme', 'port', 'query']
E   Matchers failed :
E   host - assertion failure :
E   publicsuffix.org != eu.api.ovh.com
E   path - assertion failure :
E   /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610752

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
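[Editor's note] Since every failure stems from the empty cache in the fresh chroot, one workaround is to seed tldextract's cache while the network is still reachable and point the test run at it. tldextract consults the TLDEXTRACT_CACHE environment variable when choosing its cache directory; a sketch under that assumption, with a hypothetical location:

import os

# Must be set before tldextract is imported: the module-level extractor
# picks its cache directory up at import time.
os.environ["TLDEXTRACT_CACHE"] = "/tmp/tldextract-cache"  # hypothetical location

import tldextract

# The first extraction fetches the PSL once and writes it into the cache;
# later runs (e.g. inside the offline build chroot) replay from disk.
tldextract.extract("example.com")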
_ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

[... locals and the two chained run_and_cache()/DiskCache.get() failures are identical to the tests above ...]

E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_should_remove_record(self):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:318:

[... call chain identical to the first failure ...]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_delete_record_by_filter_should_remove_record.yaml') in your current record mode ('none').
E   No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E   Found 4 similar requests with 2 different matcher(s) :
E
E   [... similar requests 1-3 identical to the first failure ...]
E
E   4 - (<Request (GET) https://eu.api.ovh.com/1.0/domain/zone/pacalis.net/record/1570610753>).
E   Matchers succeeded : ['method', 'scheme', 'port', 'query']
E   Matchers failed :
E   host - assertion failure :
E   publicsuffix.org != eu.api.ovh.com
E   path - assertion failure :
E   /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610753

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
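[Editor's note] The lexicon frames repeated in every traceback show what the auto provider is doing before it ever reaches OVH: it resolves the domain's NS records and reduces each nameserver to its registered domain via tldextract, so a concrete provider can be guessed from the nameserver operator. A sketch of that idea with dnspython; the function name here is illustrative, not lexicon's:

import dns.resolver  # dnspython
import tldextract

def nameserver_domains(domain: str) -> set:
    """Registered domains of a zone's nameservers, e.g. {'ovh.net'}."""
    answers = dns.resolver.resolve(domain, "NS")
    domains = set()
    for record in answers:
        ext = tldextract.extract(str(record.target).rstrip("."))
        domains.add(f"{ext.domain}.{ext.suffix}")
    return domains

# e.g. nameserver_domains("pacalis.net") -> {"ovh.net"} for an OVH-hosted zone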
_ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

[... locals and the two chained run_and_cache()/DiskCache.get() failures are identical to the tests above ...]

E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:342:

[... call chain identical to the first failure ...]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record.yaml') in your current record mode ('none').
E   No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E   Found 4 similar requests with 2 different matcher(s) :
E
E   [... similar requests 1-3 identical to the first failure ...]
E
E   4 - (<Request (GET) https://eu.api.ovh.com/1.0/domain/zone/pacalis.net/record/1570610754>).
E   Matchers succeeded : ['method', 'scheme', 'port', 'query']
E   Matchers failed :
E   host - assertion failure :
E   publicsuffix.org != eu.api.ovh.com
E   path - assertion failure :
E   /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610754

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610754 /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: 
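
Each of these CannotOverwriteExistingCassetteException blocks follows from vcrpy's replay rules: in record mode 'none' a cassette is read-only, and a live request is only played back if it satisfies all of vcrpy's default matchers (method, scheme, host, port, path, query) against some recorded interaction. The publicsuffix.org fetch matches none of the recorded OVH API calls, failing on host and path exactly as listed. A hedged sketch of that configuration (the cassette path here is illustrative; the real fixtures live under tests/fixtures/cassettes/auto/IntegrationTests/):

    import vcr
    import requests

    # Replay-only VCR: never record, never touch the network.
    replay_only = vcr.VCR(record_mode="none")

    with replay_only.use_cassette("tests/fixtures/cassettes/example.yaml"):
        # Plays back only if method/scheme/host/port/path/query all match a
        # recorded request; otherwise vcrpy raises
        # CannotOverwriteExistingCassetteException, as seen throughout this log.
        requests.get("https://eu.api.ovh.com/1.0/auth/time")
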
_make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record( self, ): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:328: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) 
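
The repeated "During handling of the above exception, another exception occurred" separators are ordinary Python exception chaining: raising inside an except block links the new exception to the one being handled, which is why each failure prints the two cache KeyErrors before the final VCR error. A toy reproduction of the same shape:

    def fetch_suffix_list():
        # Stands in for the network fetch that VCR blocks.
        raise RuntimeError("no matching cassette entry")

    cache = {}
    try:
        value = cache["publicsuffix.org-tlds"]  # first error: cache miss
    except KeyError:
        # Raising here chains the RuntimeError onto the KeyError, producing the
        # same "During handling of the above exception..." traceback output.
        fetch_suffix_list()
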
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 4 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status E E 4 - (). 
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610755 /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: 
_make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_by_identifier_should_remove_record( self, ): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:309: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) 
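
The trigger visible in these frames is lexicon's auto provider calling tldextract.extract() on each nameserver host, which makes tldextract try to refresh the Public Suffix List over HTTP on first use. Assuming the tldextract 3.x API shown in these paths, an extractor constructed with no suffix list URLs stays on the bundled snapshot and never issues that GET:

    import tldextract

    # With no URLs to fetch, tldextract falls back to its bundled Public
    # Suffix List snapshot: no HTTP request, no cache file needed.
    no_fetch = tldextract.TLDExtract(suffix_list_urls=())

    result = no_fetch("ns1.example.co.uk")
    # -> subdomain 'ns1', domain 'example', suffix 'co.uk'

Note that lexicon calls the module-level tldextract.extract() with its process-wide default extractor, so this is a sketch of the mechanism rather than a drop-in fix for the failing tests.
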
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_delete_record_by_identifier_should_remove_record.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 4 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status E E 4 - (). 
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610756 /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" 
try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched( self, ): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:556: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) 
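
Every AutoProviderTests case in this run dies the same way before reaching any provider logic, so in an offline build chroot one pragmatic option (a hypothetical invocation, not something this PKGBUILD is shown doing) is to deselect the whole class by keyword:

    import pytest

    # Run the suite without the network-dependent auto-provider tests.
    raise SystemExit(pytest.main(["-k", "not AutoProviderTests", "lexicon/tests"]))
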
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 7 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status E E 4 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610757 E E 5 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610757 E E 6 - (). 
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610758 E E 7 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610758 /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_with_record_set_name_remove_all(self): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:536: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, 
timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_delete_record_with_record_set_name_remove_all.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 6 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status E E 4 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610760 E E 5 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610760 E E 6 - (). 
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610762 /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_list_records_after_setting_ttl __ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except 
OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_list_records_after_setting_ttl(self): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:226: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return 
cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_after_setting_ttl.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 4 similar requests with 2 different matcher(s) : E E 1 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/auth/time E E 2 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/ E E 3 - (). E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status E E 4 - (). 
E          Matchers succeeded : ['method', 'scheme', 'port', 'query']
E          Matchers failed :
E          host - assertion failure :
E              publicsuffix.org != eu.api.ovh.com
E          path - assertion failure :
E              /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610763

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ AutoProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

self = <tldextract.cache.DiskCache object at 0x...>
func = <function _get_suffix_lists at 0x...>
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': <tldextract.cache.DiskCache object at 0x...>, 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

    def run_and_cache(
        self,
        func: Callable[..., T],
        namespace: str,
        kwargs: Dict[str, Hashable],
        hashed_argnames: Iterable[str],
    ) -> T:
        """Get a url but cache the response."""
        if not self.enabled:
            return func(**kwargs)

        key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames}
        cache_filepath = self._key_to_cachefile_path(namespace, key_args)
        lock_path = cache_filepath + ".lock"
        try:
            _make_dir(cache_filepath)
        except OSError as ioe:
            global _DID_LOG_UNABLE_TO_CACHE  # pylint: disable=global-statement
            if not _DID_LOG_UNABLE_TO_CACHE:
                LOG.warning(
                    "unable to cache %s.%s in %s. This could refresh the "
                    "Public Suffix List over HTTP every app startup. "
                    "Construct your `TLDExtract` with a writable `cache_dir` or "
                    "set `cache_dir=None` to silence this warning. %s",
                    namespace,
                    key_args,
                    cache_filepath,
                    ioe,
                )
                _DID_LOG_UNABLE_TO_CACHE = True
            return func(**kwargs)

        # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102)
        # pylint: disable-next=abstract-class-instantiated
        with FileLock(lock_path, timeout=self.lock_timeout):
            try:
>               result = cast(T, self.get(namespace=namespace, key=key_args))

/usr/lib/python3.11/site-packages/tldextract/cache.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <tldextract.cache.DiskCache object at 0x...>
namespace = 'publicsuffix.org-tlds'
key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}

    def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object:
        """Retrieve a value from the disk cache"""
        if not self.enabled:
            raise KeyError("Cache is disabled")
        cache_filepath = self._key_to_cachefile_path(namespace, key)
        if not os.path.isfile(cache_filepath):
>           raise KeyError("namespace: " + namespace + " key: " + repr(key))
E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

[second pass through DiskCache.run_and_cache / DiskCache.get, identical to the
 frames above but with func = <function _fetch_url at 0x...>, namespace = 'urls'
 and key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}]

E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self = <AutoProviderTests testMethod=test_provider_when_calling_list_records_should_handle_record_sets>

    @vcr_integration_test
    def test_provider_when_calling_list_records_should_handle_record_sets(self):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:522:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider
    provider.authenticate()
lexicon/providers/auto.py:203: in authenticate
    (provider_name, provider_module) = _relevant_provider_for_domain(
lexicon/providers/auto.py:72: in _relevant_provider_for_domain
    nameserver_domains = _get_ns_records_domains_for_domain(domain)
lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain
    tlds = [
lexicon/providers/auto.py:40: in <listcomp>
    tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
    return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
    return self.extract_str(url, include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
    return self._extract_netloc(lenient_netloc(url), include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
    suffix_index = self._get_tld_extractor().suffix_index(
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
    public_tlds, private_tlds = get_suffix_lists(
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
    return cache.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
    text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout)
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
    return cache.cached_fetch_url(
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
    return self.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
    resp = conn.urlopen(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
    httplib_response = self._make_request(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
    httplib_response = conn.getresponse(buffering=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <...>, _ = False, kwargs = {'buffering': True}

    def getresponse(self, _=False, **kwargs):
        """Retrieve the response"""
        # Check to see if the cassette has a response for this request. If so,
        # then return it
        if self.cassette.can_play_response_for(self._vcr_request):
            log.info("Playing response for {} from cassette".format(self._vcr_request))
            response = self.cassette.play_response(self._vcr_request)
            return VCRHTTPResponse(response)
        else:
            if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request):
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_should_handle_record_sets.yaml') in your current record mode ('none').
E               No match for the request (<...>) was found.
E               Found 6 similar requests with 2 different matcher(s) :
E               each matched on ['method', 'scheme', 'port', 'query'] and failed on
E               host (publicsuffix.org != eu.api.ovh.com) and on path
E               (/list/public_suffix_list.dat != the path listed):
E                 1 - /1.0/auth/time
E                 2 - /1.0/domain/zone/
E                 3 - /1.0/domain/zone/pacalis.net/status
E                 4 - /1.0/domain/zone/pacalis.net/record/1570610766
E                 5 - /1.0/domain/zone/pacalis.net/record/1570610766
E                 6 - /1.0/domain/zone/pacalis.net/record/1570610768

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
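Note: every failure in this run has the shape above. tldextract misses its disk
cache twice (namespace 'publicsuffix.org-tlds', then 'urls'), falls back to
fetching the Public Suffix List over HTTP, and vcrpy rejects that request
because the write-protected cassette only holds OVH API exchanges. A minimal
sketch of the workaround the LOG.warning text itself suggests, assuming the
tldextract 3.x constructor arguments (not something this build currently does):

    # Sketch: keep tldextract offline so no PSL request ever reaches vcrpy.
    import tldextract

    offline_extract = tldextract.TLDExtract(
        suffix_list_urls=(),        # never fetch publicsuffix.org / GitHub
        fallback_to_snapshot=True,  # use the snapshot bundled with the package
        cache_dir=None,             # skip the DiskCache whose misses raise KeyError
    )

    # ExtractResult(subdomain='eu.api', domain='ovh', suffix='com')
    print(offline_extract("eu.api.ovh.com"))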
_ AutoProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

[identical tldextract cache-miss traceback as above, raised from
 provider = self._construct_authenticated_provider() at
 lexicon/tests/providers/integration_tests.py:214]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record.yaml') in your current record mode ('none').
E               No match for the request (<...>) was found.
E               Found 4 similar requests with 2 different matcher(s) :
E               each matched on ['method', 'scheme', 'port', 'query'] and failed on
E               host (publicsuffix.org != eu.api.ovh.com) and on path
E               (/list/public_suffix_list.dat != the path listed):
E                 1 - /1.0/auth/time
E                 2 - /1.0/domain/zone/
E                 3 - /1.0/domain/zone/pacalis.net/status
E                 4 - /1.0/domain/zone/pacalis.net/record/1570610770

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
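Note: the matcher lists pinpoint the mismatch: vcrpy compares each live request
against every recorded one and replays only on a full match across all
configured matchers. A hedged sketch of the setup these messages imply
(record_mode and match_on are real vcrpy options; the cassette name and test
function below are illustrative, not lexicon's actual code):

    # Sketch: replay-only vcrpy setup that raises this exception for any
    # request absent from the cassette, e.g. the Public Suffix List fetch.
    import vcr

    replay_only = vcr.VCR(
        cassette_library_dir="tests/fixtures/cassettes",
        record_mode="none",  # the mode named in the error: replay, never record
        match_on=["method", "scheme", "host", "port", "path", "query"],
    )

    @replay_only.use_cassette("auto/IntegrationTests/example.yaml")
    def fetch_psl():
        # A GET to https://publicsuffix.org/list/public_suffix_list.dat matches
        # recorded OVH calls on method/scheme/port/query, but host and path
        # differ, so vcrpy raises CannotOverwriteExistingCassetteException.
        ...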
_ AutoProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

[identical tldextract cache-miss traceback as above, raised from
 provider = self._construct_authenticated_provider() at
 lexicon/tests/providers/integration_tests.py:200]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_with_full_name_filter_should_return_record.yaml') in your current record mode ('none').
E               No match for the request (<...>) was found.
E               Found 4 similar requests with 2 different matcher(s) :
E               each matched on ['method', 'scheme', 'port', 'query'] and failed on
E               host (publicsuffix.org != eu.api.ovh.com) and on path
E               (/list/public_suffix_list.dat != the path listed):
E                 1 - /1.0/auth/time
E                 2 - /1.0/domain/zone/
E                 3 - /1.0/domain/zone/pacalis.net/status
E                 4 - /1.0/domain/zone/pacalis.net/record/1570610771

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ AutoProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

[identical tldextract cache-miss traceback as above, raised from
 provider = self._construct_authenticated_provider() at
 lexicon/tests/providers/integration_tests.py:516]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list.yaml') in your current record mode ('none').
E               No match for the request (<...>) was found.
E               Found 3 similar requests with 2 different matcher(s) :
E               each matched on ['method', 'scheme', 'port', 'query'] and failed on
E               host (publicsuffix.org != eu.api.ovh.com) and on path
E               (/list/public_suffix_list.dat != the path listed):
E                 1 - /1.0/auth/time
E                 2 - /1.0/domain/zone/
E                 3 - /1.0/domain/zone/pacalis.net/status

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ AutoProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

[identical tldextract cache-miss traceback as above, raised from
 provider = self._construct_authenticated_provider() at
 lexicon/tests/providers/integration_tests.py:188]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_with_name_filter_should_return_record.yaml') in your current record mode ('none').
E               No match for the request (<...>) was found.
E               Found 4 similar requests with 2 different matcher(s) :
E               each matched on ['method', 'scheme', 'port', 'query'] and failed on
E               host (publicsuffix.org != eu.api.ovh.com) and on path
E               (/list/public_suffix_list.dat != the path listed):
E                 1 - /1.0/auth/time
E                 2 - /1.0/domain/zone/
E                 3 - /1.0/domain/zone/pacalis.net/status
E                 4 - /1.0/domain/zone/pacalis.net/record/1570610773

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
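Note: the chained "During handling of the above exception" blocks reflect the
usual cache-or-compute flow: DiskCache.get raises KeyError on a miss, and
run_and_cache catches it and calls func(**kwargs) to fetch and store the value
(cache.py:210 in the frames above). A generic, hypothetical re-implementation
of that flow, not tldextract's actual code:

    # Sketch: cache-or-fetch. A miss surfaces as an exception, which is
    # answered by computing the value; in this log the "compute" step is
    # the HTTP GET that vcrpy blocks.
    import json, os
    from typing import Any, Callable

    def run_and_cache(path: str, fetch: Callable[[], Any]) -> Any:
        try:
            with open(path) as f:              # cache hit
                return json.load(f)
        except (OSError, ValueError):          # cache miss
            result = fetch()                   # may raise, as it does here
            os.makedirs(os.path.dirname(path) or ".", exist_ok=True)
            with open(path, "w") as f:
                json.dump(result, f)
            return result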
E Matchers succeeded : ['method', 'scheme', 'port', 'query'] E Matchers failed : E host - assertion failure : E publicsuffix.org != eu.api.ovh.com E path - assertion failure : E /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610773 /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ AutoProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: 
_make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_list_records_with_no_arguments_should_list_all(self): > provider = self._construct_authenticated_provider() lexicon/tests/providers/integration_tests.py:181: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider provider.authenticate() lexicon/providers/auto.py:203: in authenticate (provider_name, provider_module) = _relevant_provider_for_domain( lexicon/providers/auto.py:72: in _relevant_provider_for_domain nameserver_domains = _get_ns_records_domains_for_domain(domain) lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain tlds = [ lexicon/providers/auto.py:40: in tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) 
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
    return cache.cached_fetch_url(
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
    return self.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
    resp = conn.urlopen(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
    httplib_response = self._make_request(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
    httplib_response = conn.getresponse(buffering=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
_ = False, kwargs = {'buffering': True}

    def getresponse(self, _=False, **kwargs):
        """Retrieve the response"""
        # Check to see if the cassette has a response for this request. If so,
        # then return it
        if self.cassette.can_play_response_for(self._vcr_request):
            log.info("Playing response for {} from cassette".format(self._vcr_request))
            response = self.cassette.play_response(self._vcr_request)
            return VCRHTTPResponse(response)
        else:
            if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request):
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_list_records_with_no_arguments_should_list_all.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 27 similar requests with 2 different matcher(s) :
E    1 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/auth/time
E    2 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/
E    3 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status
E    4 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record
E    5 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644693
E    6 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644694
E    7 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644697
E    8 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644695
E    9 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610742
E   10 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610744
E   11 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644696
E   12 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644690
E   13 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610770
E   14 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610771
E   15 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610773
E   16 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610763
E   17 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644691
E   18 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1553644692
E   19 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610750
E   20 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610751
E   21 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610758
E   22 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610745
E   23 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610747
E   24 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610766
E   25 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610768
E   26 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610752
E   27 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610749

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
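The failure above is the root pattern for everything that follows: lexicon's auto provider calls tldextract, tldextract finds no cached Public Suffix List, and its fallback HTTP GET to publicsuffix.org reaches vcrpy, whose cassette only holds recorded eu.api.ovh.com traffic. A minimal sketch of how a test run could sidestep the fetch entirely; the constructor arguments are real tldextract options, but wiring this into lexicon's suite is left open:

# Sketch: build an extractor that never fetches the suffix list.
# With no remote URLs and fallback_to_snapshot left at its default
# (True), tldextract uses the snapshot bundled with the release.
import tldextract

no_fetch_extract = tldextract.TLDExtract(cache_dir=None, suffix_list_urls=())

result = no_fetch_extract("eu.api.ovh.com")
print(result.subdomain, result.domain, result.suffix)  # eu.api ovh com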
_ AutoProviderTests.test_provider_when_calling_update_record_should_modify_record _

self =
func =
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

    def run_and_cache(
        self,
        func: Callable[..., T],
        namespace: str,
        kwargs: Dict[str, Hashable],
        hashed_argnames: Iterable[str],
    ) -> T:
        """Get a url but cache the response."""
        if not self.enabled:
            return func(**kwargs)

        key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames}
        cache_filepath = self._key_to_cachefile_path(namespace, key_args)
        lock_path = cache_filepath + ".lock"
        try:
            _make_dir(cache_filepath)
        except OSError as ioe:
            global _DID_LOG_UNABLE_TO_CACHE  # pylint: disable=global-statement
            if not _DID_LOG_UNABLE_TO_CACHE:
                LOG.warning(
                    "unable to cache %s.%s in %s. This could refresh the "
                    "Public Suffix List over HTTP every app startup. "
                    "Construct your `TLDExtract` with a writable `cache_dir` or "
                    "set `cache_dir=None` to silence this warning. %s",
                    namespace,
                    key_args,
                    cache_filepath,
                    ioe,
                )
                _DID_LOG_UNABLE_TO_CACHE = True
            return func(**kwargs)

        # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102)
        # pylint: disable-next=abstract-class-instantiated
        with FileLock(lock_path, timeout=self.lock_timeout):
            try:
>               result = cast(T, self.get(namespace=namespace, key=key_args))

/usr/lib/python3.11/site-packages/tldextract/cache.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
namespace = 'publicsuffix.org-tlds'
key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}

    def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object:
        """Retrieve a value from the disk cache"""
        if not self.enabled:
            raise KeyError("Cache is disabled")
        cache_filepath = self._key_to_cachefile_path(namespace, key)

        if not os.path.isfile(cache_filepath):
>           raise KeyError("namespace: " + namespace + " key: " + repr(key))
E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self =
func =
namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']
...
>               result = cast(T, self.get(namespace=namespace, key=key_args))

/usr/lib/python3.11/site-packages/tldextract/cache.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
namespace = 'urls'
key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
...
>           raise KeyError("namespace: " + namespace + " key: " + repr(key))
E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_update_record_should_modify_record(self):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider
    provider.authenticate()
lexicon/providers/auto.py:203: in authenticate
    (provider_name, provider_module) = _relevant_provider_for_domain(
lexicon/providers/auto.py:72: in _relevant_provider_for_domain
    nameserver_domains = _get_ns_records_domains_for_domain(domain)
lexicon/providers/auto.py:39: in _get_ns_records_domains_for_domain
    tlds = [
lexicon/providers/auto.py:40: in
    tldextract.extract(ns_entry) for ns_entry in _get_ns_records_for_domain(domain)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
    return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
    return self.extract_str(url, include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
    return self._extract_netloc(lenient_netloc(url), include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
    suffix_index = self._get_tld_extractor().suffix_index(
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
    public_tlds, private_tlds = get_suffix_lists(
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
    return cache.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
    text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout)
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
    return cache.cached_fetch_url(
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
    return self.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
    return self.request("GET", url, **kwargs)
...
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
    httplib_response = conn.getresponse(buffering=True)
...
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_update_record_should_modify_record.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 4 similar requests with 2 different matcher(s) :
E    1 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/auth/time
E    2 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/
E    3 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status
E    4 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610777

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
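Each failure repeats the same three-stage chain: DiskCache.get raises KeyError on the cold cache, run_and_cache falls through to func(**kwargs), and that live fetch is what VCR intercepts. Warming the cache once before pytest starts, while the network is still reachable, would turn every later lookup into a cache hit. A sketch under the assumption that tldextract honors the TLDEXTRACT_CACHE environment variable for its default cache location; warm_psl_cache is an illustrative name:

# Sketch: pre-seed tldextract's disk cache so tests never hit the network.
import os
import tldextract

def warm_psl_cache(cache_dir: str) -> None:
    # Assumption: TLDEXTRACT_CACHE steers the default cache location.
    os.environ["TLDEXTRACT_CACHE"] = cache_dir
    extractor = tldextract.TLDExtract(cache_dir=cache_dir)
    # One extraction forces the suffix list to be fetched and written
    # to cache_dir; later calls inside cassettes are pure cache hits.
    extractor("example.com")

warm_psl_cache("/tmp/tldextract-cache")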
_ AutoProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

...
E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:
...
E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_update_record_should_modify_record_name_specified(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:266:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider
    provider.authenticate()
lexicon/providers/auto.py:203: in authenticate
    (provider_name, provider_module) = _relevant_provider_for_domain(
...
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
...
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_update_record_should_modify_record_name_specified.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 4 similar requests with 2 different matcher(s) :
E    1 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/auth/time
E    2 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/
E    3 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status
E    4 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610778

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
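The record mode ('none') named in these messages is vcrpy's strict playback mode: every request must match a recorded interaction on all configured matchers, and nothing new is recorded. A sketch of that configuration; the cassette path and matcher list are illustrative, not lexicon's exact harness:

# Sketch: strict playback, the mode these tests are running under.
import requests
import vcr

my_vcr = vcr.VCR(
    record_mode="none",  # play back only; unmatched requests raise
    match_on=["method", "scheme", "host", "port", "path", "query"],
)

with my_vcr.use_cassette("tests/fixtures/cassettes/example.yaml"):
    # Succeeds only if the cassette holds a matching interaction; a
    # stray GET to publicsuffix.org raises
    # CannotOverwriteExistingCassetteException, exactly as seen above.
    requests.get("https://eu.api.ovh.com/1.0/auth/time")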
_ AutoProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _

...
E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:
...
E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_update_record_with_fqdn_name_should_modify_record(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:290:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider
    provider.authenticate()
lexicon/providers/auto.py:203: in authenticate
    (provider_name, provider_module) = _relevant_provider_for_domain(
...
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
...
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_update_record_with_fqdn_name_should_modify_record.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 4 similar requests with 2 different matcher(s) :
E    1 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/auth/time
E    2 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/
E    3 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status
E    4 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610779

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
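Every mismatch so far is the same stray GET, so another angle is to exempt the suffix-list hosts from cassette matching altogether. ignore_hosts is a real vcrpy option; the caveat is that ignored requests then go out live, which still fails in a chroot with no outbound network:

# Sketch: let PSL traffic bypass the cassette instead of failing a match.
# Note: ignored hosts are contacted for real, so this only helps when
# the build environment actually has network access.
import vcr

psl_tolerant_vcr = vcr.VCR(
    record_mode="none",
    ignore_hosts=("publicsuffix.org", "raw.githubusercontent.com"),
)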
_ AutoProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _

...
E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:
...
E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_update_record_with_full_name_should_modify_record(
        self,
    ):
>       provider = self._construct_authenticated_provider()

lexicon/tests/providers/integration_tests.py:274:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/tests/providers/integration_tests.py:433: in _construct_authenticated_provider
    provider.authenticate()
lexicon/providers/auto.py:203: in authenticate
    (provider_name, provider_module) = _relevant_provider_for_domain(
...
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
...
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/auto/IntegrationTests/test_provider_when_calling_update_record_with_full_name_should_modify_record.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 4 similar requests with 2 different matcher(s) :
E    1 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/auth/time
E    2 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/
E    3 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/status
E    4 - ().  Matchers succeeded: ['method', 'scheme', 'port', 'query'] ; failed: host publicsuffix.org != eu.api.ovh.com ; path /list/public_suffix_list.dat != /1.0/domain/zone/pacalis.net/record/1570610784

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
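From the packager's side, the pragmatic fix is usually to skip the affected integration tests inside the offline chroot. A conftest.py sketch; the class names are taken from the failures in this log, and the wiring itself is stock pytest:

# Sketch for a conftest.py: skip provider integration tests that need
# live DNS or an HTTP fetch, neither of which the build chroot provides.
import pytest

OFFLINE_SKIPS = ("AutoProviderTests", "NamecheapProviderTests")

def pytest_collection_modifyitems(config, items):
    skip_offline = pytest.mark.skip(reason="no network in build chroot")
    for item in items:
        if any(cls in item.nodeid for cls in OFFLINE_SKIPS):
            item.add_marker(skip_offline)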
_ NamecheapProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

...
E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:
...
E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_A_with_valid_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("A", "localhost", "127.0.0.1")

lexicon/tests/providers/integration_tests.py:142:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:79: in create_record
    return self._create_record(rtype, name, content)
lexicon/providers/namecheap.py:207: in _create_record
    self.client.domains_dns_add_host(self.domain, record)
lexicon/providers/namecheap.py:411: in domains_dns_add_host
    host_records_remote = self.domains_dns_get_hosts(domain)
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
    extracted = tldextract.extract(domain)
...
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
...
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_for_A_with_valid_name_and_content.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 3 similar requests with 4 different matcher(s) :
E    1 - ().  Matchers succeeded: ['scheme', 'port'] ; failed:
E        method  GET != POST
E        host    publicsuffix.org != api.sandbox.namecheap.com
E        path    /list/public_suffix_list.dat != /xml.response
E        query   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')]
E    2 - ().  Matchers succeeded: ['scheme', 'port'] ; failed:
E        method  GET != POST
E        host    publicsuffix.org != api.sandbox.namecheap.com
E        path    /list/public_suffix_list.dat != /xml.response
E        query   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]
E    3 - ().  Matchers succeeded: ['scheme', 'port'] ; failed:
E        method  GET != POST
E        host    publicsuffix.org != api.sandbox.namecheap.com
E        path    /list/public_suffix_list.dat != /xml.response
E        query   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
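The Namecheap diffs fail on four matchers at once because every recorded interaction is a POST to /xml.response, distinguished only by its query parameters. If the cassettes themselves ever need refreshing against the sandbox, vcrpy's looser record modes can write new interactions; a sketch, with credentials and the driving code left out:

# Sketch: re-recording against api.sandbox.namecheap.com.
# "new_episodes" appends interactions that have no recorded match;
# "once" records only when the cassette file does not exist yet.
import vcr

rerecord_vcr = vcr.VCR(record_mode="new_episodes")

with rerecord_vcr.use_cassette("tests/fixtures/cassettes/namecheap/example.yaml"):
    ...  # drive the provider's create_record/list_records calls here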
cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record("CNAME", "docs", "docs.example.com") lexicon/tests/providers/integration_tests.py:149: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, 
cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 3 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 3 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} 
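Every failure in this section follows the pattern above: the Namecheap provider calls tldextract.extract(), the module-level extractor finds an empty disk cache in the fresh chroot, and falls back to fetching the Public Suffix List over HTTP, a request the write-protected VCR cassette was never recorded for. A minimal sketch of that implicit first-use behavior (the hostname is only illustrative):

```python
# Minimal sketch of the implicit network dependency, assuming an empty
# tldextract disk cache (as in a fresh build chroot). The first call must
# load the Public Suffix List; with no cached copy, tldextract issues a
# GET to https://publicsuffix.org/list/public_suffix_list.dat before it
# can split the hostname into subdomain/domain/suffix.
import tldextract

result = tldextract.extract("api.sandbox.namecheap.com")
print(result.subdomain, result.domain, result.suffix)
```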
_ NamecheapProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

[... tldextract DiskCache KeyError chain identical to the CNAME failure above ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.fqdn.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:172:
[... call chain identical to the CNAME failure above: lexicon/providers/namecheap.py -> tldextract.extract -> requests -> vcr ...]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 3 similar requests with 4 different matcher(s) :
[... matcher details identical to the CNAME failure above (recorded POSTs to api.sandbox.namecheap.com: namecheap.domains.getInfo, namecheap.domains.dns.getHosts, namecheap.domains.dns.setHosts) ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
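For context on the matcher lists in these failures: vcrpy only replays a recorded response when the request agrees on every configured matcher, and record mode 'none' makes the cassette write-protected, so an unmatched request raises instead of being recorded. A rough sketch of such a replay-only setup, assuming a hypothetical cassette path (this is not lexicon's actual test harness):

```python
# Sketch of a write-protected vcrpy configuration. record_mode="none"
# forbids recording new interactions, and match_on lists the matchers
# seen failing in this log: method, host, path and query all differ
# between the recorded Namecheap POSTs and the live publicsuffix.org GET.
import vcr

replay_only = vcr.VCR(
    record_mode="none",
    match_on=["method", "scheme", "host", "port", "path", "query"],
)

with replay_only.use_cassette("tests/fixtures/cassettes/example.yaml"):
    pass  # any HTTP call without a recorded match raises
          # vcr.errors.CannotOverwriteExistingCassetteException
```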
_ NamecheapProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

[... tldextract DiskCache KeyError chain identical to the CNAME failure above ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_full_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.full.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:163:
[... call chain identical to the CNAME failure above: lexicon/providers/namecheap.py -> tldextract.extract -> requests -> vcr ...]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_full_name_and_content.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 3 similar requests with 4 different matcher(s) :
[... matcher details identical to the CNAME failure above (recorded POSTs: namecheap.domains.getInfo, namecheap.domains.dns.getHosts, namecheap.domains.dns.setHosts) ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ NamecheapProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

[... tldextract DiskCache KeyError chain identical to the CNAME failure above ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "_acme-challenge.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:156:
[... call chain identical to the CNAME failure above: lexicon/providers/namecheap.py -> tldextract.extract -> requests -> vcr ...]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 3 similar requests with 4 different matcher(s) :
[... matcher details identical to the CNAME failure above (recorded POSTs: namecheap.domains.getInfo, namecheap.domains.dns.getHosts, namecheap.domains.dns.setHosts) ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
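One plausible way to make these tests hermetic, offered as a sketch rather than the upstream fix, is to construct an extractor that never fetches and instead relies on the Public Suffix List snapshot bundled with tldextract:

```python
# Sketch, assuming tldextract 3.x: an extractor with no live fetch.
# An empty suffix_list_urls disables the HTTP download entirely, and
# fallback_to_snapshot=True falls back to the list shipped inside the
# tldextract package, so no request ever reaches VCR.
import tldextract

offline_extract = tldextract.TLDExtract(
    suffix_list_urls=(),
    fallback_to_snapshot=True,
)
print(offline_extract("api.sandbox.namecheap.com").registered_domain)
```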
_ NamecheapProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

[... tldextract DiskCache KeyError chain identical to the CNAME failure above ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_create_record_multiple_times_should_create_record_set(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.createrecordset.{self.domain}.", "challengetoken1"
        )

lexicon/tests/providers/integration_tests.py:505:
[... call chain identical to the CNAME failure above: lexicon/providers/namecheap.py -> tldextract.extract -> requests -> vcr ...]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_multiple_times_should_create_record_set.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 5 similar requests with 4 different matcher(s) :
[... matcher details identical to the CNAME failure above; the five recorded POSTs are namecheap.domains.getInfo, namecheap.domains.dns.getHosts, namecheap.domains.dns.setHosts, namecheap.domains.dns.getHosts, namecheap.domains.dns.setHosts ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_create_record_with_duplicate_records_should_be_noop( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record( "TXT", f"_acme-challenge.noop.{self.domain}.", "challengetoken" ) lexicon/tests/providers/integration_tests.py:491: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_create_record_with_duplicate_records_should_be_noop.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 6 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 5 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
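Alternatively, the cache could be warmed once while network access is still available, so the test runs never need the live Public Suffix List; tldextract 3.x reads the TLDEXTRACT_CACHE environment variable to locate its cache directory (the path below is illustrative):

```python
# Sketch: pre-seed the tldextract disk cache before running pytest.
# TLDEXTRACT_CACHE must be set before tldextract is imported, because the
# module-level TLD_EXTRACTOR picks up its cache directory at creation time.
import os

os.environ["TLDEXTRACT_CACHE"] = "/tmp/tldextract-cache"  # hypothetical path

import tldextract  # imported after setting the env var so it takes effect

tldextract.extract("example.com")  # first call fetches and populates the cache
```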
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_by_filter_should_remove_record(self): provider = self._construct_authenticated_provider() > assert provider.create_record("TXT", "delete.testfilt", "challengetoken") lexicon/tests/providers/integration_tests.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
    return self.request("GET", url, **kwargs)
    [requests -> urllib3 -> vcr.stubs frames identical to the previous failure]

E       vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_delete_record_by_filter_should_remove_record.yaml') in your current record mode ('none').
E       No match for the request (<Request (GET) https://publicsuffix.org/list/public_suffix_list.dat>) was found.
E       Found 7 similar requests with 4 different matcher(s) :
E       [all seven are recorded POSTs to api.sandbox.namecheap.com/xml.response; matcher output identical to the previous failure]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
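The root cause is visible in the chained exceptions: tldextract's DiskCache has no cached copy of the Public Suffix List (the KeyError), so it falls through to a live GET of https://publicsuffix.org/list/public_suffix_list.dat, which VCR then rejects. tldextract's own warning text suggests the fix; a short sketch of both options (illustrative, not part of this build, paths are placeholders):

    import tldextract

    # Option 1: a writable cache - the PSL is fetched once, then reused from disk.
    cached = tldextract.TLDExtract(cache_dir="/tmp/tldextract-cache")

    # Option 2: no cache and no fetch URLs - fall back to the snapshot bundled
    # with the tldextract package, so no network access is needed at all.
    offline = tldextract.TLDExtract(cache_dir=None, suffix_list_urls=())

    print(offline("api.sandbox.namecheap.com").registered_domain)  # namecheap.com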
_ NamecheapProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"delete.testfqdn.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:343:
    [same chained traceback as above: tldextract DiskCache KeyError for the 'publicsuffix.org-tlds' and 'urls' namespaces, then the live GET of the Public Suffix List is rejected by VCR]
E       vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record.yaml') in your current record mode ('none').
E       Found 7 similar requests with 4 different matcher(s) : [matcher output identical to the first failure]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ NamecheapProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"delete.testfull.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:329:
    [same chained traceback as above]
E       vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record.yaml') in your current record mode ('none').
E       Found 7 similar requests with 4 different matcher(s) : [matcher output identical to the first failure]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ NamecheapProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_identifier_should_remove_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "delete.testid", "challengetoken")

lexicon/tests/providers/integration_tests.py:310:
    [same chained traceback as above]
E       vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_delete_record_by_identifier_should_remove_record.yaml') in your current record mode ('none').
E       Found 8 similar requests with 4 different matcher(s) : [matcher output identical to the first failure]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
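For a chroot build like this one, where the network is reachable while dependencies are installed but the replayed tests must stay offline, another workaround is to pre-seed tldextract's cache before pytest runs. tldextract reads the TLDEXTRACT_CACHE environment variable for its cache directory; the sketch below is a hypothetical pre-seeding step (the path is illustrative), not something this PKGBUILD currently does:

    import os

    # Must be set before tldextract is imported: the module-level extractor
    # picks up TLDEXTRACT_CACHE when the module is first loaded.
    os.environ["TLDEXTRACT_CACHE"] = "/tmp/tldextract-cache"  # illustrative path

    import tldextract

    # Fetches the Public Suffix List once while the network is available;
    # later calls inside the test suite are served from the disk cache and
    # never issue the GET that VCR rejects above.
    tldextract.extract("example.com")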
_ NamecheapProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _

    @vcr_integration_test
    def test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT",
            f"_acme-challenge.deleterecordinset.{self.domain}.",
            "challengetoken1",
        )

lexicon/tests/providers/integration_tests.py:557:
    [same chained traceback as above]
E       vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched.yaml') in your current record mode ('none').
E       Found 9 similar requests with 4 different matcher(s) : [matcher output identical to the first failure]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ NamecheapProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _

[pytest prints the same run_and_cache()/get() source context as above, first for
namespace 'publicsuffix.org-tlds', then for namespace 'urls']

E           KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_delete_record_with_record_set_name_remove_all(self):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.deleterecordset.{self.domain}.", "challengetoken1"
        )

lexicon/tests/providers/integration_tests.py:537:
[call chain identical to the previous failure: create_record ->
lexicon/providers/namecheap.py:397 domains_dns_get_hosts -> tldextract.extract ->
requests -> vcr getresponse]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_delete_record_with_record_set_name_remove_all.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 11 similar requests with 4 different matcher(s) :
E               [all succeed only on ['scheme', 'port'] and fail on method
E               (GET != POST), host (publicsuffix.org != api.sandbox.namecheap.com),
E               path (/list/public_suffix_list.dat != /xml.response) and query]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
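The paired KeyErrors come straight from the cache scheme in the get() listing: each (namespace, hashed key args) pair maps to one file under cache_dir, and a missing file is reported as a KeyError rather than a None. A simplified illustration of that pattern, assuming JSON storage and an md5 digest (both are illustrative guesses, not tldextract's exact code):

    import hashlib
    import json
    import os

    def cache_filepath(cache_dir, namespace, key_args):
        # One file per (namespace, selected kwargs); the digest choice is illustrative.
        digest = hashlib.md5(json.dumps(key_args, sort_keys=True).encode()).hexdigest()
        return os.path.join(cache_dir, namespace, digest + ".json")

    def get(cache_dir, namespace, key_args):
        path = cache_filepath(cache_dir, namespace, key_args)
        if not os.path.isfile(path):
            # A miss is an exception, which the caller catches before
            # falling back to the real fetch function.
            raise KeyError("namespace: " + namespace + " key: " + repr(key_args))
        with open(path, encoding="utf-8") as f:
            return json.load(f)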
_ NamecheapProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _

[same chained KeyErrors as above: first "namespace: publicsuffix.org-tlds", then
"namespace: urls", both raised at /usr/lib/python3.11/site-packages/tldextract/cache.py:106]

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_list_records_should_handle_record_sets(self):
        provider = self._construct_authenticated_provider()
>       provider.create_record(
            "TXT", f"_acme-challenge.listrecordset.{self.domain}.", "challengetoken1"
        )

lexicon/tests/providers/integration_tests.py:523:
[call chain identical to the first failure]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_list_records_should_handle_record_sets.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 6 similar requests with 4 different matcher(s) :
E               [all fail on the same method/host/path/query matchers as above]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
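run_and_cache(), printed in full for every failure, is a read-through cache: look the key up, and on a miss run the real function under a file lock and store the result. A condensed sketch of that control flow; the key_to_path() helper and the set() call are assumed counterparts to the get() shown above:

    from filelock import FileLock

    def run_and_cache(cache, func, namespace, kwargs, hashed_argnames):
        # Only the named arguments participate in the cache key.
        key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames}
        cache_filepath = cache.key_to_path(namespace, key_args)  # assumed helper

        with FileLock(cache_filepath + ".lock", timeout=cache.lock_timeout):
            try:
                result = cache.get(namespace=namespace, key=key_args)
            except KeyError:
                result = func(**kwargs)  # miss: do the real work (here, the HTTP fetch)
                cache.set(namespace=namespace, key=key_args, value=result)  # assumed setter
            return result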
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record( self, ): provider = self._construct_authenticated_provider() > provider.create_record( "TXT", f"random.fqdntest.{self.domain}.", "challengetoken" ) lexicon/tests/providers/integration_tests.py:215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 4 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v 
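On the vcrpy side, record mode 'none' makes a cassette playback-only, and the matcher list decides what counts as the same request, which is why every report shows ['scheme', 'port'] succeeding while method, host, path and query fail. A hedged reconstruction of that configuration with vcrpy's public API (the cassette path is one of the real fixtures above; the matcher list is inferred from the reports):

    import vcr

    namecheap_vcr = vcr.VCR(
        record_mode="none",  # playback only: unrecorded requests raise instead of hitting the network
        match_on=["method", "scheme", "host", "port", "path", "query"],
    )

    cassette = ("tests/fixtures/cassettes/namecheap/IntegrationTests/"
                "test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record.yaml")

    with namecheap_vcr.use_cassette(cassette):
        # The recorded POSTs to api.sandbox.namecheap.com replay fine, but an
        # unrecorded GET to publicsuffix.org fails every matcher except
        # scheme/port, so vcrpy raises CannotOverwriteExistingCassetteException.
        ...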
_ NamecheapProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

[same chained KeyErrors for 'publicsuffix.org-tlds' and 'urls' as above]

During handling of the above exception, another exception occurred:

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_full_name_filter_should_return_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       provider.create_record(
            "TXT", f"random.fulltest.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:201:
[call chain identical to the first failure]

E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_list_records_with_full_name_filter_should_return_record.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 4 similar requests with 4 different matcher(s) :
E               [all fail on the same method/host/path/query matchers as above]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
kwargs.items() if k in hashed_argnames}
        cache_filepath = self._key_to_cachefile_path(namespace, key_args)
        lock_path = cache_filepath + ".lock"
        try:
            _make_dir(cache_filepath)
        except OSError as ioe:
            global _DID_LOG_UNABLE_TO_CACHE  # pylint: disable=global-statement
            if not _DID_LOG_UNABLE_TO_CACHE:
                LOG.warning(
                    "unable to cache %s.%s in %s. This could refresh the "
                    "Public Suffix List over HTTP every app startup. "
                    "Construct your `TLDExtract` with a writable `cache_dir` or "
                    "set `cache_dir=None` to silence this warning. %s",
                    namespace,
                    key_args,
                    cache_filepath,
                    ioe,
                )
                _DID_LOG_UNABLE_TO_CACHE = True
            return func(**kwargs)

        # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102)
        # pylint: disable-next=abstract-class-instantiated
        with FileLock(lock_path, timeout=self.lock_timeout):
            try:
>               result = cast(T, self.get(namespace=namespace, key=key_args))

/usr/lib/python3.11/site-packages/tldextract/cache.py:208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , namespace = 'urls'
key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}

    def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object:
        """Retrieve a value from the disk cache"""
        if not self.enabled:
            raise KeyError("Cache is disabled")
        cache_filepath = self._key_to_cachefile_path(namespace, key)

        if not os.path.isfile(cache_filepath):
>           raise KeyError("namespace: " + namespace + " key: " + repr(key))
E           KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

/usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       records = provider.list_records("TXT", f"filter.thisdoesnotexist.{self.domain}")

lexicon/tests/providers/integration_tests.py:517:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:92: in list_records
    return self._list_records(rtype=rtype, name=name, content=content)
lexicon/providers/namecheap.py:215: in _list_records
    return self._list_records_internal(rtype=rtype, name=name, content=content)
lexicon/providers/namecheap.py:221: in _list_records_internal
    raw_records = self.client.domains_dns_get_hosts(self.domain)
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
    extracted = tldextract.extract(domain)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
    return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
    return self.extract_str(url, include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
    return self._extract_netloc(lenient_netloc(url), include_psl_private_domains)
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
    suffix_index = self._get_tld_extractor().suffix_index(
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
    public_tlds, private_tlds = get_suffix_lists(
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
    return cache.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
    text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout)
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
    return cache.cached_fetch_url(
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
    return self.run_and_cache(
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
    result = func(**kwargs)
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
    response = session.get(url, timeout=timeout)
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
    return self.request("GET", url, **kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
    resp = self.send(prep, **send_kwargs)
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
    r = adapter.send(request, **kwargs)
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
    resp = conn.urlopen(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
    httplib_response = self._make_request(
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
    httplib_response = conn.getresponse(buffering=True)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
_ = False, kwargs = {'buffering': True}

    def getresponse(self, _=False, **kwargs):
        """Retrieve the response"""
        # Check to see if the cassette has a response for this request. If so,
        # then return it
        if self.cassette.can_play_response_for(self._vcr_request):
            log.info("Playing response for {} from cassette".format(self._vcr_request))
            response = self.cassette.play_response(self._vcr_request)
            return VCRHTTPResponse(response)
        else:
            if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request):
>               raise CannotOverwriteExistingCassetteException(
                    cassette=self.cassette, failed_request=self._vcr_request
                )
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 2 similar requests with 4 different matcher(s) :
E
E               1 - ().
E               Matchers succeeded : ['scheme', 'port']
E               Matchers failed :
E               method - assertion failure : GET != POST
E               host - assertion failure : publicsuffix.org != api.sandbox.namecheap.com
E               path - assertion failure : /list/public_suffix_list.dat != /xml.response
E               query - assertion failure : [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')]
E
E               2 - ().
E               Matchers succeeded : ['scheme', 'port']
E               Matchers failed :
E               method - assertion failure : GET != POST
E               host - assertion failure : publicsuffix.org != api.sandbox.namecheap.com
E               path - assertion failure : /list/public_suffix_list.dat != /xml.response
E               query - assertion failure : [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
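Every failure in this report has the same root cause: the fresh chroot has no tldextract disk cache, so `DiskCache.get` raises `KeyError`, tldextract falls back to fetching the Public Suffix List over HTTP, and vcrpy's replay-only cassette rejects the unrecorded request. A minimal sketch of how a test environment could rule the fetch out entirely, using tldextract's public constructor arguments (the variable name is illustrative; this is not lexicon's actual code):

```python
# Sketch: build a TLDExtract that can never touch the network, so a VCR
# cassette in record mode 'none' has no unrecorded request to reject.
import tldextract

# suffix_list_urls=() removes the HTTP sources; fallback_to_snapshot=True
# (the default) then serves the Public Suffix List snapshot bundled with
# the package. cache_dir=None also avoids the "unable to cache" warning
# path visible in run_and_cache() above.
offline_extract = tldextract.TLDExtract(
    suffix_list_urls=(),
    fallback_to_snapshot=True,
    cache_dir=None,
)

print(offline_extract("api.sandbox.namecheap.com"))
# e.g. ExtractResult(subdomain='api.sandbox', domain='namecheap', suffix='com')
```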
_ NamecheapProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

[... same tldextract cache traceback as in the previous failure: DiskCache.run_and_cache raises KeyError for namespace 'publicsuffix.org-tlds', then for namespace 'urls', before attempting a live fetch of https://publicsuffix.org/list/public_suffix_list.dat ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_name_filter_should_return_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       provider.create_record("TXT", "random.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:189:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:79: in create_record
    return self._create_record(rtype, name, content)
lexicon/providers/namecheap.py:207: in _create_record
    self.client.domains_dns_add_host(self.domain, record)
lexicon/providers/namecheap.py:411: in domains_dns_add_host
    host_records_remote = self.domains_dns_get_hosts(domain)
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
    extracted = tldextract.extract(domain)
[... same tldextract -> requests -> urllib3 -> vcr getresponse() frames as in the previous failure ...]
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_list_records_with_name_filter_should_return_record.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 4 similar requests with 4 different matcher(s) :
[every similar request failed the same four matchers (method GET != POST, host publicsuffix.org != api.sandbox.namecheap.com, path /list/public_suffix_list.dat != /xml.response, query mismatch); the recorded queries were:]
E               1 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')]
E               2 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]
E               3 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E               4 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
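The matcher report printed by vcrpy shows which request attributes it compared: `scheme` and `port` matched, while `method`, `host`, `path`, and `query` did not, because the suffix-list GET looks nothing like the recorded Namecheap API POSTs. A sketch of the corresponding vcrpy setup, inferred from the output (replay-only record mode plus these six matchers); the cassette path is the one named in the error:

```python
# Sketch of a replay-only vcrpy configuration matching the behaviour in this
# log: record_mode='none' forbids recording, and any request that fails the
# matchers raises CannotOverwriteExistingCassetteException.
import requests
import vcr

replay_vcr = vcr.VCR(
    record_mode="none",
    match_on=["method", "scheme", "host", "port", "path", "query"],
)

cassette = (
    "tests/fixtures/cassettes/namecheap/IntegrationTests/"
    "test_provider_when_calling_list_records_with_no_arguments_should_list_all.yaml"
)

with replay_vcr.use_cassette(cassette):
    # An unrecorded GET like this raises CannotOverwriteExistingCassetteException,
    # just as tldextract's suffix-list fetch does in the tests above.
    requests.get("https://publicsuffix.org/list/public_suffix_list.dat")
```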
_ NamecheapProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

[... same tldextract cache traceback as above: DiskCache.run_and_cache raises KeyError for namespace 'publicsuffix.org-tlds', then for namespace 'urls' ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_no_arguments_should_list_all(self):
        provider = self._construct_authenticated_provider()
>       assert isinstance(provider.list_records(), list)

lexicon/tests/providers/integration_tests.py:182:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:92: in list_records
    return self._list_records(rtype=rtype, name=name, content=content)
lexicon/providers/namecheap.py:215: in _list_records
    return self._list_records_internal(rtype=rtype, name=name, content=content)
lexicon/providers/namecheap.py:221: in _list_records_internal
    raw_records = self.client.domains_dns_get_hosts(self.domain)
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
    extracted = tldextract.extract(domain)
[... same tldextract -> requests -> urllib3 -> vcr getresponse() frames as in the first failure ...]
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_list_records_with_no_arguments_should_list_all.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 2 similar requests with 4 different matcher(s) :
[both similar requests failed the same four matchers (method GET != POST, host publicsuffix.org != api.sandbox.namecheap.com, path /list/public_suffix_list.dat != /xml.response, query mismatch); the recorded queries were:]
E               1 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')]
E               2 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
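Each traceback opens with a `KeyError` followed by "During handling of the above exception, another exception occurred": that is Python's implicit exception chaining. `run_and_cache` evidently calls `self.get()` inside a `try:` and computes the value on a miss; the `except KeyError:` branch itself is never shown in the captured output, so the sketch below is the generic cache-or-compute pattern, not tldextract's exact code:

```python
# Sketch of the cache-or-compute pattern behind the chained tracebacks above.
from typing import Callable, Dict

class TinyDiskCache:
    """Simplified stand-in for tldextract's DiskCache (illustrative only)."""

    def __init__(self) -> None:
        self._store: Dict[str, object] = {}

    def get(self, key: str) -> object:
        if key not in self._store:
            # The KeyError that opens every failure in this log.
            raise KeyError(key)
        return self._store[key]

    def run_and_cache(self, func: Callable[[], object], key: str) -> object:
        try:
            return self.get(key)
        except KeyError:
            # If func() raises here (as the VCR-blocked fetch does), Python
            # reports it as "During handling of the above exception, another
            # exception occurred" -- exactly the chaining seen in this log.
            result = func()
            self._store[key] = result
            return result
```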
_ NamecheapProviderTests.test_provider_when_calling_update_record_should_modify_record _

[... same tldextract cache traceback as above: DiskCache.run_and_cache raises KeyError for namespace 'publicsuffix.org-tlds', then for namespace 'urls' ...]

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_update_record_should_modify_record(self):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "orig.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:256:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:79: in create_record
    return self._create_record(rtype, name, content)
lexicon/providers/namecheap.py:207: in _create_record
    self.client.domains_dns_add_host(self.domain, record)
lexicon/providers/namecheap.py:411: in domains_dns_add_host
    host_records_remote = self.domains_dns_get_hosts(domain)
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
    extracted = tldextract.extract(domain)
[... same tldextract -> requests -> urllib3 -> vcr getresponse() frames as in the first failure ...]
E               vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_update_record_should_modify_record.yaml') in your current record mode ('none').
E               No match for the request () was found.
E               Found 7 similar requests with 4 different matcher(s) :
[every similar request failed the same four matchers (method GET != POST, host publicsuffix.org != api.sandbox.namecheap.com, path /list/public_suffix_list.dat != /xml.response, query mismatch); the recorded queries were:]
E               1 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')]
E               2 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]
E               3 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E               4 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]
E               5 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]
E               6 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')]
E               7 - [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
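Since `DiskCache.get` only raises when its cache file is missing, another way to keep these tests off the network is to warm the cache once while HTTP access is still allowed. A sketch assuming tldextract's `TLDEXTRACT_CACHE` environment variable for the cache directory; the path below is an arbitrary example, not anything from this build:

```python
# Sketch: pre-seed tldextract's disk cache so DiskCache.get() succeeds and no
# HTTP fetch is attempted during the VCR-replayed tests.
import os

# Must be set before tldextract is imported, because the default cache_dir
# is resolved when the module loads.
os.environ["TLDEXTRACT_CACHE"] = "/tmp/tldextract-cache"  # example path

import tldextract

warm = tldextract.TLDExtract()
warm("example.com")  # first call downloads and caches the Public Suffix List
```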
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_update_record_should_modify_record_name_specified( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record("TXT", "orig.nameonly.test", "challengetoken") lexicon/tests/providers/integration_tests.py:267: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_update_record_should_modify_record_name_specified.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 8 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 5 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 8 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_update_record_with_fqdn_name_should_modify_record( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record( "TXT", f"orig.testfqdn.{self.domain}.", "challengetoken" ) lexicon/tests/providers/integration_tests.py:291: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/IntegrationTests/test_provider_when_calling_update_record_with_fqdn_name_should_modify_record.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 7 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest2.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 5 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest2'), ('TLD', 'dev')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
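In a network-restricted chroot build such as this one, a packager could also simply deselect the provider tests that insist on a live suffix-list fetch. A sketch using pytest's standard `-k` selection; the expression is an assumption about how one might filter, not taken from lexicon's PKGBUILD:

```python
# Sketch: run the lexicon test suite while skipping the namecheap provider
# tests failing in this log. Equivalent to:
#   pytest lexicon/tests -k "not namecheap"
import sys
import pytest

sys.exit(pytest.main(["lexicon/tests", "-k", "not namecheap"]))
```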
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_update_record_with_full_name_should_modify_record( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record( "TXT", f"orig.testfull.{self.domain}", "challengetoken" ) lexicon/tests/providers/integration_tests.py:275: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_for_A_with_valid_name_and_content _

[tldextract cache KeyError chain -- namespace 'publicsuffix.org-tlds', then namespace 'urls' -- identical to the failure above]

self = <...>

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_A_with_valid_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("A", "localhost", "127.0.0.1")

lexicon/tests/providers/integration_tests.py:142:
[call chain through lexicon/providers/namecheap.py, tldextract and requests identical to the failures above]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette
E   ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_for_A_with_valid_name_and_content.yaml')
E   in your current record mode ('none').
E   No match for the request () was found.
E   Found 3 similar requests with 4 different matcher(s) :
E
E   1 - ().
E   Matchers succeeded : ['scheme', 'port']
E   Matchers failed :
E       method - assertion failure : GET != POST
E       host   - assertion failure : publicsuffix.org != api.sandbox.namecheap.com
E       path   - assertion failure : /list/public_suffix_list.dat != /xml.response
E       query  - assertion failure : [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')]
E
E   2 - ().
E   Matchers as above, except:
E       query  - assertion failure : [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   3 - ().
E   Matchers as above, except:
E       query  - assertion failure : [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content _

[tldextract cache KeyError chain identical to the failures above]

self = <...>

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("CNAME", "docs", "docs.example.com")

lexicon/tests/providers/integration_tests.py:149:
[call chain identical to the failures above]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette
E   ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content.yaml')
E   in your current record mode ('none').
E   No match for the request () was found.
E   Found 3 similar requests with 4 different matcher(s) : [same three mismatches, against domain unittest-seconddomain.dev, as in the A-record failure above]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
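For readers unfamiliar with vcrpy's report format: "Matchers succeeded/failed" lists the request-matching criteria the cassette was configured with. Both the recorded POSTs to api.sandbox.namecheap.com and the attempted GET to publicsuffix.org are HTTPS on port 443, so only 'scheme' and 'port' can ever succeed; everything request-specific fails. Roughly how such a matcher set is wired up (illustrative; the cassette path is a placeholder and lexicon's harness may configure VCR differently):

    import vcr

    my_vcr = vcr.VCR(
        record_mode="none",  # replay only -- never write to the cassette
        match_on=["scheme", "port", "method", "host", "path", "query"],
    )

    # Inside the block, any request that no recorded interaction matches
    # on all six criteria raises CannotOverwriteExistingCassetteException.
    with my_vcr.use_cassette("tests/fixtures/cassettes/example.yaml"):
        ...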
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content _

[tldextract cache KeyError chain identical to the failures above]

self = <...>

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.fqdn.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:172:
[call chain identical to the failures above]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette
E   ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content.yaml')
E   in your current record mode ('none').
E   No match for the request () was found.
E   Found 3 similar requests with 4 different matcher(s) : [same three mismatches as in the A-record failure above]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
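The reason the namecheap provider needs the Public Suffix List at all is visible in the recorded queries: Namecheap's API takes the domain pre-split into SLD and TLD parameters, and namecheap.py:397 derives that split with tldextract. A sketch of the derivation (variable names illustrative):

    import tldextract

    # On a cold cache this module-level call is exactly what triggers
    # the download attempt seen in the tracebacks; see the offline
    # extractor sketched earlier for the network-free variant.
    extracted = tldextract.extract("unittest2.dev")
    sld, tld = extracted.domain, extracted.suffix
    print(sld, tld)  # unittest2 dev -- cf. [('SLD', 'unittest2'), ('TLD', 'dev')] in the cassettes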
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_for_TXT_with_full_name_and_content _

[tldextract cache KeyError chain identical to the failures above]

self = <...>

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_full_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.full.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:163:
[call chain identical to the failures above]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette
E   ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_full_name_and_content.yaml')
E   in your current record mode ('none').
E   No match for the request () was found.
E   Found 3 similar requests with 4 different matcher(s) : [same three mismatches as in the A-record failure above]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
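None of this is VCR malfunctioning: in record mode 'none' an existing cassette is deliberately write-protected, so the unmatched GET becomes a hard failure rather than a silent re-record. Re-recording would look roughly like the sketch below, but it needs network access plus Namecheap sandbox credentials, neither of which exists in this chroot (the record-mode names are real vcrpy modes; the rest is illustrative):

    import vcr

    # 'all' always re-records; 'once' and 'new_episodes' also write in
    # some circumstances. Only 'none', used by this test suite, never does.
    rerecord = vcr.VCR(record_mode="all")
    # with rerecord.use_cassette("tests/fixtures/cassettes/namecheap/..."):
    #     run_the_failing_test()

Pre-seeding or stubbing tldextract, as sketched earlier, is therefore the practical fix for an offline build.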
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content _

[tldextract cache KeyError chain identical to the failures above]

self = <...>

    @vcr_integration_test
    def test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "_acme-challenge.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:156:
[call chain identical to the failures above]

E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette
E   ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content.yaml')
E   in your current record mode ('none').
E   No match for the request () was found.
E   Found 3 similar requests with 4 different matcher(s) : [same three mismatches as in the A-record failure above]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 3 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 3 - (). 
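Every failure in this run follows the pattern above: tldextract misses its on-disk cache (the two KeyError chains), falls back to downloading the Public Suffix List over HTTP, and the write-protected VCR cassette rejects that unexpected GET. A minimal sketch, assuming the `suffix_list_urls` and `fallback_to_snapshot` keywords of tldextract's public constructor, of an extractor that can never trigger that fetch (the variable names are mine, not lexicon's):

    import tldextract

    # suffix_list_urls=() disables the remote fetch entirely; with
    # fallback_to_snapshot=True (the default) the copy of the Public Suffix
    # List bundled inside the tldextract package is used instead, so no GET
    # to publicsuffix.org is ever issued.
    offline_extractor = tldextract.TLDExtract(
        suffix_list_urls=(),
        fallback_to_snapshot=True,
    )

    # Hypothetical usage, mirroring the host seen in the failed matchers:
    print(offline_extractor("api.sandbox.namecheap.com"))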
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_multiple_times_should_create_record_set _

self =

    @vcr_integration_test
    def test_provider_when_calling_create_record_multiple_times_should_create_record_set(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.createrecordset.{self.domain}.", "challengetoken1"
        )

lexicon/tests/providers/integration_tests.py:505:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_multiple_times_should_create_record_set.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 5 similar requests with 4 different matcher(s) :
E
E   1 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')]
E
E   2 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   3 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   4 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   5 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
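lexicon's namecheap provider calls the module-level tldextract.extract() (namecheap.py:397 in the frames above), so an offline extractor has to replace the module-level TLD_EXTRACTOR that tldextract.py:366 delegates to. A hypothetical autouse pytest fixture sketching that swap; the fixture name and placement are my invention:

    import pytest
    import tldextract

    @pytest.fixture(autouse=True)
    def offline_tldextract(monkeypatch):
        # tldextract.extract() delegates to the module-level TLD_EXTRACTOR
        # (tldextract/tldextract.py:366 in the traceback), so swapping that
        # global keeps the provider off the network for the whole test.
        offline = tldextract.TLDExtract(suffix_list_urls=(), cache_dir=None)
        monkeypatch.setattr(tldextract.tldextract, "TLD_EXTRACTOR", offline)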
_ NamecheapManagedProviderTests.test_provider_when_calling_create_record_with_duplicate_records_should_be_noop _

self =

    @vcr_integration_test
    def test_provider_when_calling_create_record_with_duplicate_records_should_be_noop(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"_acme-challenge.noop.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:491:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_create_record_with_duplicate_records_should_be_noop.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 6 similar requests with 4 different matcher(s) :
E
E   1 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')]
E
E   2 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   3 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   4 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   5 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   6 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
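The same problem can also be attacked from the vcrpy side: its ignore_hosts option passes requests for the listed hosts straight through instead of matching them against the cassette. A sketch, with the host names taken from the failed matchers above; note that inside a network-isolated build chroot the passed-through request would still fail, so this variant only helps where outbound HTTP is allowed:

    import vcr

    # ignore_hosts makes vcrpy bypass the cassette for these hosts entirely;
    # everything apart from the option itself is illustrative.
    my_vcr = vcr.VCR(
        record_mode="none",  # the same mode as the failing run
        ignore_hosts=("publicsuffix.org", "raw.githubusercontent.com"),
    )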
_ NamecheapManagedProviderTests.test_provider_when_calling_delete_record_by_filter_should_remove_record _

self =

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_should_remove_record(self):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "delete.testfilt", "challengetoken")

lexicon/tests/providers/integration_tests.py:319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_delete_record_by_filter_should_remove_record.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 7 similar requests with 4 different matcher(s) :
E
E   1 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')]
E
E   2 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   3 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   4 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   5 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   6 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   7 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
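The record mode named in the exception is the other lever: "none" never writes and raises on any unmatched request, while "new_episodes" appends unmatched requests to the cassette. A sketch of re-recording the missing suffix-list GET, reusing the cassette path from the failure above; the surrounding test call is elided:

    import vcr

    # record_mode="none" (the failing run) forbids writes; "new_episodes"
    # records any request the cassette does not already contain.
    recorder = vcr.VCR(record_mode="new_episodes")
    with recorder.use_cassette(
        "tests/fixtures/cassettes/namecheap/managed-IntegrationTests/"
        "test_provider_when_calling_delete_record_by_filter_should_remove_record.yaml"
    ):
        pass  # re-running the provider call here would record the missing GET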
_ NamecheapManagedProviderTests.test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record _

self =

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"delete.testfqdn.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:343:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 7 similar requests with 4 different matcher(s) :
E
E   1 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')]
E
E   2 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   3 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   4 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   5 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   6 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   7 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
_ NamecheapManagedProviderTests.test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record _

self =

    @vcr_integration_test
    def test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"delete.testfull.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:329:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 7 similar requests with 4 different matcher(s) :
E
E   1 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')]
E
E   2 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   3 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   4 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   5 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]
E
E   6 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')]
E
E   7 - ().
E   query - assertion failure :
E   [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapManagedProviderTests.test_provider_when_calling_delete_record_by_identifier_should_remove_record _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
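Every failure in this run reduces to the same chain: the clean build chroot has no tldextract disk cache, so DiskCache.get raises KeyError twice, tldextract then tries to download the Public Suffix List, and vcrpy (record mode 'none') refuses the unrecorded GET. A minimal sketch of an extractor that stays offline, assuming only tldextract 3.x's documented constructor arguments (suffix_list_urls, fallback_to_snapshot):

    # Sketch: an extractor that never fetches the Public Suffix List.
    # With suffix_list_urls=() there is nothing to download, and
    # fallback_to_snapshot=True makes tldextract use the PSL snapshot
    # bundled with the package, so no HTTP request is ever issued.
    import tldextract

    no_fetch_extract = tldextract.TLDExtract(
        suffix_list_urls=(),        # nothing to download
        fallback_to_snapshot=True,  # rely on the snapshot in the wheel
    )

    result = no_fetch_extract("api.sandbox.namecheap.com")
    print(result.domain, result.suffix)  # -> namecheap com

Such an extractor would short-circuit _get_tld_extractor() before it ever reaches find_first_response, so neither the cache nor the network is consulted.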
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_by_identifier_should_remove_record( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record("TXT", "delete.testid", "challengetoken") lexicon/tests/providers/integration_tests.py:310: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_delete_record_by_identifier_should_remove_record.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 8 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 5 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 8 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapManagedProviderTests.test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
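The cassettes themselves could also be made tolerant of the PSL download. ignore_hosts and record_mode are real vcrpy options; the cassette path below is illustrative only, and lexicon's own vcr_integration_test decorator is not reproduced here:

    # Sketch: a vcrpy configuration that exempts Public Suffix List
    # downloads from cassette matching instead of raising
    # CannotOverwriteExistingCassetteException.
    import vcr

    psl_tolerant = vcr.VCR(
        record_mode="none",
        ignore_hosts=["publicsuffix.org", "raw.githubusercontent.com"],
    )

    with psl_tolerant.use_cassette("tests/fixtures/cassettes/example.yaml"):
        ...  # requests to the ignored hosts bypass the cassette entirely

In an offline chroot the passed-through request would still fail to connect, but tldextract's fallback_to_snapshot path should then take over rather than the cassette exception seen above.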
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched( self, ): provider = self._construct_authenticated_provider() > assert provider.create_record( "TXT", f"_acme-challenge.deleterecordinset.{self.domain}.", "challengetoken1", ) lexicon/tests/providers/integration_tests.py:557: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = 
session.get(url, timeout=timeout) /usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 9 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 5 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 8 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 9 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapManagedProviderTests.test_provider_when_calling_delete_record_with_record_set_name_remove_all _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
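Because the provider calls the module-level tldextract.extract() (namecheap.py:397 in the trace), which delegates to the global TLD_EXTRACTOR visible at tldextract.py:366, the test suite could pin that global to an offline instance. A hypothetical conftest.py sketch, not taken from the lexicon tree:

    # conftest.py sketch (hypothetical): swap tldextract's module-level
    # TLD_EXTRACTOR for an offline instance during every test, so no
    # test depends on a writable cache directory or live HTTP.
    import pytest
    import tldextract

    _OFFLINE = tldextract.TLDExtract(suffix_list_urls=())

    @pytest.fixture(autouse=True)
    def offline_tldextract(monkeypatch):
        # tldextract.extract() reads this module attribute on each call.
        monkeypatch.setattr(tldextract.tldextract, "TLD_EXTRACTOR", _OFFLINE)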
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_delete_record_with_record_set_name_remove_all(self): provider = self._construct_authenticated_provider() > assert provider.create_record( "TXT", f"_acme-challenge.deleterecordset.{self.domain}.", "challengetoken1" ) lexicon/tests/providers/integration_tests.py:537: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_delete_record_with_record_set_name_remove_all.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 11 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 5 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 8 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 9 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 10 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 11 - (). 
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapManagedProviderTests.test_provider_when_calling_list_records_should_handle_record_sets _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. %s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v 
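The call site could equally sidestep the problem: domains_dns_get_hosts only needs the registrable-domain split that Namecheap's API expects as SLD/TLD. A provider-side sketch (not the actual upstream patch; _split_domain is a hypothetical helper) that performs the split offline:

    # Sketch of an offline SLD/TLD split, replacing the fetching
    # tldextract.extract(domain) call shown at
    # lexicon/providers/namecheap.py:397 in the traceback.
    import tldextract

    _EXTRACTOR = tldextract.TLDExtract(suffix_list_urls=())

    def _split_domain(domain: str) -> tuple[str, str]:
        """Return (SLD, TLD) for Namecheap's API without network I/O."""
        extracted = _EXTRACTOR(domain)
        return extracted.domain, extracted.suffix

    # _split_domain("unittest-seconddomain.dev")
    # -> ("unittest-seconddomain", "dev"), matching the SLD/TLD query
    #    parameters recorded in the cassettes above.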
_ NamecheapManagedProviderTests.test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record _

[same tldextract cache-miss chain as above]

self = <...>

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       provider.create_record(
            "TXT", f"random.fqdntest.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:215:
[same call chain through lexicon/providers/namecheap.py:397 and tldextract as above]
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = namespace = 'publicsuffix.org-tlds' key = {'fallback_to_snapshot': True, 'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = func = , namespace = 'urls' kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} hashed_argnames = ['url'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
%s", namespace, key_args, cache_filepath, ioe, ) _DID_LOG_UNABLE_TO_CACHE = True return func(**kwargs) # Disable lint of 3rd party (see also https://github.com/tox-dev/py-filelock/issues/102) # pylint: disable-next=abstract-class-instantiated with FileLock(lock_path, timeout=self.lock_timeout): try: > result = cast(T, self.get(namespace=namespace, key=key_args)) /usr/lib/python3.11/site-packages/tldextract/cache.py:208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , namespace = 'urls' key = {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'} def get(self, namespace: str, key: Union[str, Dict[str, Hashable]]) -> object: """Retrieve a value from the disk cache""" if not self.enabled: raise KeyError("Cache is disabled") cache_filepath = self._key_to_cachefile_path(namespace, key) if not os.path.isfile(cache_filepath): > raise KeyError("namespace: " + namespace + " key: " + repr(key)) E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}" /usr/lib/python3.11/site-packages/tldextract/cache.py:106: KeyError During handling of the above exception, another exception occurred: self = @vcr_integration_test def test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record( self, ): provider = self._construct_authenticated_provider() > provider.create_record( "TXT", f"random.fqdntest.{self.domain}.", "challengetoken" ) lexicon/tests/providers/integration_tests.py:215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ lexicon/providers/base.py:79: in create_record return self._create_record(rtype, name, content) lexicon/providers/namecheap.py:207: in _create_record self.client.domains_dns_add_host(self.domain, record) lexicon/providers/namecheap.py:411: in domains_dns_add_host host_records_remote = self.domains_dns_get_hosts(domain) lexicon/providers/namecheap.py:397: in domains_dns_get_hosts extracted = tldextract.extract(domain) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract return TLD_EXTRACTOR(url, include_psl_private_domains=include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__ return self.extract_str(url, include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str return self._extract_netloc(lenient_netloc(url), include_psl_private_domains) /usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc suffix_index = self._get_tld_extractor().suffix_index( /usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor public_tlds, private_tlds = get_suffix_lists( /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists return cache.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists text = find_first_response(cache, urls, cache_fetch_timeout=cache_fetch_timeout) /usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response return cache.cached_fetch_url( /usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url return self.run_and_cache( /usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache result = func(**kwargs) /usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url response = session.get(url, timeout=timeout) 
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get return self.request("GET", url, **kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /usr/lib/python3.11/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) /usr/lib/python3.11/site-packages/requests/adapters.py:489: in send resp = conn.urlopen( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen httplib_response = self._make_request( /usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request httplib_response = conn.getresponse(buffering=True) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = _ = False, kwargs = {'buffering': True} def getresponse(self, _=False, **kwargs): """Retrieve the response""" # Check to see if the cassette has a response for this request. If so, # then return it if self.cassette.can_play_response_for(self._vcr_request): log.info("Playing response for {} from cassette".format(self._vcr_request)) response = self.cassette.play_response(self._vcr_request) return VCRHTTPResponse(response) else: if self.cassette.write_protected and self.cassette.filter_request(self._vcr_request): > raise CannotOverwriteExistingCassetteException( cassette=self.cassette, failed_request=self._vcr_request ) E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record.yaml') in your current record mode ('none'). E No match for the request () was found. E Found 4 similar requests with 4 different matcher(s) : E E 1 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')] E E 2 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 3 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] E E 4 - (). 
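Note on the failure pattern above: the namecheap provider calls tldextract, whose disk cache misses inside the clean chroot (the two chained KeyErrors), so tldextract falls through to a live GET of the Public Suffix List, which vcrpy refuses to serve from a cassette recorded before lexicon made this call. A minimal sketch of how the fetch can be avoided entirely, assuming only tldextract's documented constructor arguments:

# Editor's sketch, not part of the build output: with no candidate URLs and no
# cache directory, tldextract falls back to the suffix-list snapshot bundled
# in the package and never opens a network connection.
import tldextract

no_fetch = tldextract.TLDExtract(suffix_list_urls=(), cache_dir=None)
print(no_fetch("api.sandbox.namecheap.com").registered_domain)  # namecheap.com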
_ NamecheapManagedProviderTests.test_provider_when_calling_list_records_with_full_name_filter_should_return_record _

self =
func =
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

self =
func = , namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_full_name_filter_should_return_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       provider.create_record(
            "TXT", f"random.fulltest.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:201:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:79: in create_record
lexicon/providers/namecheap.py:207: in _create_record
lexicon/providers/namecheap.py:411: in domains_dns_add_host
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: in getresponse
E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_list_records_with_full_name_filter_should_return_record.yaml') in your current record mode ('none').
E No match for the request () was found.
E Found 4 similar requests with 4 different matcher(s) :
E
E 1 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method (GET != POST), host (publicsuffix.org != api.sandbox.namecheap.com), path (/list/public_suffix_list.dat != /xml.response), query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')])
E 2 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])
E 3 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')])
E 4 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
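For context on the "Matchers succeeded/failed" lines: vcrpy decides whether a live request may be served from the cassette by running named matcher functions against each recorded interaction, and record mode 'none' forbids writing anything new, so any unmatched request raises the exception seen here. A rough sketch of the relevant knobs, using vcrpy's public API and a hypothetical cassette path:

# Editor's sketch: replay-only vcrpy configuration matching what these tests use.
import vcr

my_vcr = vcr.VCR(
    record_mode="none",  # replay only: never record new interactions
    match_on=["method", "scheme", "host", "port", "path", "query"],
)

# A request failing any matcher (here, the GET to publicsuffix.org) raises
# CannotOverwriteExistingCassetteException instead of reaching the network.
with my_vcr.use_cassette("tests/fixtures/cassettes/example.yaml"):
    pass  # issue HTTP calls under replay here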
_ NamecheapManagedProviderTests.test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list _

self =
func =
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

self =
func = , namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       records = provider.list_records("TXT", f"filter.thisdoesnotexist.{self.domain}")

lexicon/tests/providers/integration_tests.py:517:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:92: in list_records
lexicon/providers/namecheap.py:215: in _list_records
lexicon/providers/namecheap.py:221: in _list_records_internal
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: in getresponse
E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list.yaml') in your current record mode ('none').
E No match for the request () was found.
E Found 2 similar requests with 4 different matcher(s) :
E
E 1 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method (GET != POST), host (publicsuffix.org != api.sandbox.namecheap.com), path (/list/public_suffix_list.dat != /xml.response), query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')])
E 2 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
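The two chained KeyErrors in every traceback are not themselves the bug: they are the cache-miss signal inside tldextract's run_and_cache, which then invokes the fetch function that vcrpy intercepts. A condensed sketch of that look-up-then-fetch pattern, with stand-in names; only filelock.FileLock is the real third-party API here:

# Editor's sketch of the pattern run_and_cache implements in the tracebacks above.
from filelock import FileLock

def run_and_cache_sketch(cache: dict, key: str, fetch, lock_path: str):
    # Serialize the cache fill across processes, as tldextract does with a .lock file.
    with FileLock(lock_path, timeout=20):
        try:
            return cache[key]    # the get(...) step, raises KeyError on a miss
        except KeyError:
            result = fetch()     # the network call vcrpy blocks in this build
            cache[key] = result
            return result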
_ NamecheapManagedProviderTests.test_provider_when_calling_list_records_with_name_filter_should_return_record _

self =
func =
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

self =
func = , namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_name_filter_should_return_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       provider.create_record("TXT", "random.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:189:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:79: in create_record
lexicon/providers/namecheap.py:207: in _create_record
lexicon/providers/namecheap.py:411: in domains_dns_add_host
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: in getresponse
E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_list_records_with_name_filter_should_return_record.yaml') in your current record mode ('none').
E No match for the request () was found.
E Found 4 similar requests with 4 different matcher(s) :
E
E 1 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method (GET != POST), host (publicsuffix.org != api.sandbox.namecheap.com), path (/list/public_suffix_list.dat != /xml.response), query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')])
E 2 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])
E 3 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')])
E 4 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
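One packaging-side workaround would be to seed tldextract's cache before the test suite runs, so the disk look-up above succeeds and no request is ever attempted. A sketch under the assumption that the build environment can reach the network at some earlier stage; TLDEXTRACT_CACHE is the environment variable tldextract consults for its cache directory:

# Editor's sketch: pre-seed the cache once, then the tests run fully offline.
import os
import tldextract

cache_dir = "/tmp/tldextract-cache"          # hypothetical writable location
os.environ["TLDEXTRACT_CACHE"] = cache_dir   # picked up by later test processes

# Run once while the network is reachable; afterwards the cached
# public_suffix_list.dat satisfies every extract() call without HTTP.
tldextract.TLDExtract(cache_dir=cache_dir)("example.com")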
_ NamecheapManagedProviderTests.test_provider_when_calling_list_records_with_no_arguments_should_list_all _

self =
func =
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

self =
func = , namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_list_records_with_no_arguments_should_list_all(self):
        provider = self._construct_authenticated_provider()
>       assert isinstance(provider.list_records(), list)

lexicon/tests/providers/integration_tests.py:182:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:92: in list_records
lexicon/providers/namecheap.py:215: in _list_records
lexicon/providers/namecheap.py:221: in _list_records_internal
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: in getresponse
E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_list_records_with_no_arguments_should_list_all.yaml') in your current record mode ('none').
E No match for the request () was found.
E Found 2 similar requests with 4 different matcher(s) :
E
E 1 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method (GET != POST), host (publicsuffix.org != api.sandbox.namecheap.com), path (/list/public_suffix_list.dat != /xml.response), query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')])
E 2 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
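Failing that, the pragmatic fix for this build is to keep the affected provider tests out of the check phase until upstream re-records the cassettes with the tldextract traffic included. A hypothetical conftest.py for the build sandbox, using only standard pytest hooks:

# Editor's sketch: skip the namecheap suite whose cassettes predate the PSL fetch.
import pytest

def pytest_collection_modifyitems(config, items):
    skip_namecheap = pytest.mark.skip(
        reason="namecheap cassettes lack the Public Suffix List request"
    )
    for item in items:
        if "Namecheap" in item.nodeid:
            item.add_marker(skip_namecheap)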
_ NamecheapManagedProviderTests.test_provider_when_calling_update_record_should_modify_record _

self =
func =
namespace = 'publicsuffix.org-tlds'
kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')}
hashed_argnames = ['urls', 'fallback_to_snapshot']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"

During handling of the above exception, another exception occurred:

self =
func = , namespace = 'urls'
kwargs = {'session': , 'timeout': None, 'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}
hashed_argnames = ['url']

/usr/lib/python3.11/site-packages/tldextract/cache.py:208: in run_and_cache
    result = cast(T, self.get(namespace=namespace, key=key_args))
/usr/lib/python3.11/site-packages/tldextract/cache.py:106: in get
    raise KeyError("namespace: " + namespace + " key: " + repr(key))
E KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self =

    @vcr_integration_test
    def test_provider_when_calling_update_record_should_modify_record(self):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "orig.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:256:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
lexicon/providers/base.py:79: in create_record
lexicon/providers/namecheap.py:207: in _create_record
lexicon/providers/namecheap.py:411: in domains_dns_add_host
lexicon/providers/namecheap.py:397: in domains_dns_get_hosts
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:366: in extract
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:219: in __call__
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:234: in extract_str
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:267: in _extract_netloc
/usr/lib/python3.11/site-packages/tldextract/tldextract.py:309: in _get_tld_extractor
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:67: in get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:89: in _get_suffix_lists
/usr/lib/python3.11/site-packages/tldextract/suffix_list.py:38: in find_first_response
/usr/lib/python3.11/site-packages/tldextract/cache.py:219: in cached_fetch_url
/usr/lib/python3.11/site-packages/tldextract/cache.py:210: in run_and_cache
/usr/lib/python3.11/site-packages/tldextract/cache.py:228: in _fetch_url
/usr/lib/python3.11/site-packages/requests/sessions.py:600: in get
/usr/lib/python3.11/site-packages/requests/sessions.py:587: in request
/usr/lib/python3.11/site-packages/requests/sessions.py:701: in send
/usr/lib/python3.11/site-packages/requests/adapters.py:489: in send
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:703: in urlopen
/usr/lib/python3.11/site-packages/urllib3/connectionpool.py:440: in _make_request
/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: in getresponse
E vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_update_record_should_modify_record.yaml') in your current record mode ('none').
E No match for the request () was found.
E Found 7 similar requests with 4 different matcher(s) :
E
E 1 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method (GET != POST), host (publicsuffix.org != api.sandbox.namecheap.com), path (/list/public_suffix_list.dat != /xml.response), query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.getInfo'), ('DomainName', 'unittest-seconddomain.dev')])
E 2 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')])
E 3 - (). Matchers succeeded : ['scheme', 'port'] ; failed : method, host, path, query ([] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')])
E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 5 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 6 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.getHosts'), ('SLD', 'unittest-seconddomain'), ('TLD', 'dev')] E E 7 - (). E Matchers succeeded : ['scheme', 'port'] E Matchers failed : E method - assertion failure : E GET != POST E host - assertion failure : E publicsuffix.org != api.sandbox.namecheap.com E path - assertion failure : E /list/public_suffix_list.dat != /xml.response E query - assertion failure : E [] != [('ClientIP', '127.0.0.1'), ('Command', 'namecheap.domains.dns.setHosts')] /usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException _ NamecheapManagedProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _ self = func = namespace = 'publicsuffix.org-tlds' kwargs = {'cache': , 'cache_fetch_timeout': None, 'fallback_to_snapshot': Tr...org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat')} hashed_argnames = ['urls', 'fallback_to_snapshot'] def run_and_cache( self, func: Callable[..., T], namespace: str, kwargs: Dict[str, Hashable], hashed_argnames: Iterable[str], ) -> T: """Get a url but cache the response.""" if not self.enabled: return func(**kwargs) key_args = {k: v for k, v in kwargs.items() if k in hashed_argnames} cache_filepath = self._key_to_cachefile_path(namespace, key_args) lock_path = cache_filepath + ".lock" try: _make_dir(cache_filepath) except OSError as ioe: global _DID_LOG_UNABLE_TO_CACHE # pylint: disable=global-statement if not _DID_LOG_UNABLE_TO_CACHE: LOG.warning( "unable to cache %s.%s in %s. This could refresh the " "Public Suffix List over HTTP every app startup. " "Construct your `TLDExtract` with a writable `cache_dir` or " "set `cache_dir=None` to silence this warning. 
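The chained KeyErrors above are not themselves the bug: raising KeyError on a missing cache file is how tldextract's DiskCache signals a miss, and run_and_cache reacts to a miss by invoking the fetch function, which is the HTTP GET that vcrpy then intercepts. A minimal sketch of that control flow, with a hypothetical cache directory and fetch callable (not tldextract's actual code):

    import json
    import os
    from typing import Callable

    CACHE_DIR = "/tmp/psl-cache"  # hypothetical stand-in for tldextract's cache_dir

    def get(namespace: str, key: str) -> object:
        """A miss is signalled with KeyError, mirroring DiskCache.get above."""
        path = os.path.join(CACHE_DIR, namespace, key + ".json")
        if not os.path.isfile(path):
            raise KeyError(f"namespace: {namespace} key: {key!r}")
        with open(path) as f:
            return json.load(f)

    def run_and_cache(func: Callable[[], object], namespace: str, key: str) -> object:
        """On a cold cache the KeyError falls through to func() -- in the log,
        the live fetch of the Public Suffix List that vcrpy rejects."""
        try:
            return get(namespace, key)
        except KeyError:
            result = func()  # assumed to return something JSON-serializable
            path = os.path.join(CACHE_DIR, namespace, key + ".json")
            os.makedirs(os.path.dirname(path), exist_ok=True)
            with open(path, "w") as f:
                json.dump(result, f)
            return result

In the build chroot the cache directory is empty on every run, so the miss branch, and with it the network fetch, is taken every time.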
_ NamecheapManagedProviderTests.test_provider_when_calling_update_record_should_modify_record_name_specified _

[... same tldextract cache-miss chain as in the previous failure ...]
E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_update_record_should_modify_record_name_specified(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record("TXT", "orig.nameonly.test", "challengetoken")

lexicon/tests/providers/integration_tests.py:267:
[... identical call chain through lexicon, tldextract, requests and vcr ...]
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_update_record_should_modify_record_name_specified.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 6 similar requests with 4 different matcher(s) :
[... matcher breakdown as in the previous failure ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
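The matcher breakdown explains the refusal precisely: the cassettes hold POSTs to api.sandbox.namecheap.com/xml.response, while the unmatched request is a GET for publicsuffix.org/list/public_suffix_list.dat, so only scheme and port line up. A hedged reconstruction of the replay setup with vcrpy (the cassette path is taken from the log; the matcher list mirrors the two succeeding and four failing matchers shown above):

    import requests
    import vcr

    # record_mode="none" write-protects the cassette: requests that match a
    # recorded interaction are replayed, anything else raises
    # CannotOverwriteExistingCassetteException rather than reaching the network.
    replay_only = vcr.VCR(
        record_mode="none",
        match_on=["method", "scheme", "host", "port", "path", "query"],
    )

    cassette = (
        "tests/fixtures/cassettes/namecheap/managed-IntegrationTests/"
        "test_provider_when_calling_update_record_should_modify_record.yaml"
    )

    with replay_only.use_cassette(cassette):
        # Reproduces the failure in the log: no recorded interaction matches
        # this GET, so vcrpy raises instead of letting it go out live.
        requests.get("https://publicsuffix.org/list/public_suffix_list.dat")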
_ NamecheapManagedProviderTests.test_provider_when_calling_update_record_with_fqdn_name_should_modify_record _
[... same tldextract cache-miss chain as in the first failure ...]
E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_update_record_with_fqdn_name_should_modify_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"orig.testfqdn.{self.domain}.", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:291:
[... identical call chain through lexicon, tldextract, requests and vcr ...]
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_update_record_with_fqdn_name_should_modify_record.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 7 similar requests with 4 different matcher(s) :
[... matcher breakdown as in the first failure ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
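Every one of these failures bottoms out in the same live fetch, so the most direct fix on the test side is to keep tldextract off the network entirely. If I read the tldextract API correctly, constructing it with an empty suffix_list_urls makes it fall back to the snapshot of the Public Suffix List bundled with the package; the final monkeypatch line is a hypothetical test-suite workaround, not something lexicon itself does:

    import tldextract

    # With no URLs to fetch, the extractor relies on the bundled PSL snapshot.
    offline_extract = tldextract.TLDExtract(suffix_list_urls=())

    result = offline_extract("orig.testfqdn.unittest-seconddomain.dev")
    print(result.subdomain, result.domain, result.suffix)
    # orig.testfqdn unittest-seconddomain dev

    # Hypothetical: route lexicon's module-level tldextract.extract() calls
    # through the offline extractor (TLDExtract instances are callable).
    tldextract.extract = offline_extract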
_ NamecheapManagedProviderTests.test_provider_when_calling_update_record_with_full_name_should_modify_record _
[... same tldextract cache-miss chain as in the first failure ...]
E   KeyError: "namespace: publicsuffix.org-tlds key: {'urls': ('https://publicsuffix.org/list/public_suffix_list.dat', 'https://raw.githubusercontent.com/publicsuffix/list/master/public_suffix_list.dat'), 'fallback_to_snapshot': True}"
E   KeyError: "namespace: urls key: {'url': 'https://publicsuffix.org/list/public_suffix_list.dat'}"

During handling of the above exception, another exception occurred:

self = 

    @vcr_integration_test
    def test_provider_when_calling_update_record_with_full_name_should_modify_record(
        self,
    ):
        provider = self._construct_authenticated_provider()
>       assert provider.create_record(
            "TXT", f"orig.testfull.{self.domain}", "challengetoken"
        )

lexicon/tests/providers/integration_tests.py:275:
[... identical call chain through lexicon, tldextract, requests and vcr ...]
E   vcr.errors.CannotOverwriteExistingCassetteException: Can't overwrite existing cassette ('tests/fixtures/cassettes/namecheap/managed-IntegrationTests/test_provider_when_calling_update_record_with_full_name_should_modify_record.yaml') in your current record mode ('none').
E   No match for the request () was found.
E   Found 7 similar requests with 4 different matcher(s) :
[... matcher breakdown as in the first failure ...]

/usr/lib/python3.11/site-packages/vcr/stubs/__init__.py:231: CannotOverwriteExistingCassetteException
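For context on the frame lexicon/providers/namecheap.py:397 that every traceback passes through: Namecheap's API addresses zones as SLD/TLD pairs rather than whole domain names, which is why the provider feeds each domain through tldextract before issuing namecheap.domains.dns.getHosts. A rough sketch of that split; the parameter names come from the query strings in the matcher output above, while the helper itself is hypothetical:

    import tldextract

    def get_hosts_params(domain: str) -> dict:
        """Split a domain the way the recorded getHosts requests expect."""
        extracted = tldextract.extract(domain)  # needs the PSL -> the failing fetch
        return {
            "Command": "namecheap.domains.dns.getHosts",
            "SLD": extracted.domain,  # e.g. 'unittest-seconddomain'
            "TLD": extracted.suffix,  # e.g. 'dev'
        }

    print(get_hosts_params("orig.testfull.unittest-seconddomain.dev"))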
=============================== warnings summary ===============================
../../../../usr/lib/python3.11/site-packages/zeep/utils.py:1
  /usr/lib/python3.11/site-packages/zeep/utils.py:1: DeprecationWarning: 'cgi' is deprecated and slated for removal in Python 3.13
    import cgi

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_authenticate
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_after_setting_ttl
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_should_handle_record_sets
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_should_modify_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record
FAILED lexicon/tests/providers/test_auto.py::AutoProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_list_records_should_handle_record_sets
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_update_record_should_modify_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_for_A_with_valid_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_for_CNAME_with_valid_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_for_TXT_with_fqdn_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_for_TXT_with_full_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_for_TXT_with_valid_name_and_content
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_multiple_times_should_create_record_set
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_create_record_with_duplicate_records_should_be_noop
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_delete_record_by_filter_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_delete_record_by_filter_with_fqdn_name_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_delete_record_by_filter_with_full_name_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_delete_record_by_identifier_should_remove_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_delete_record_with_record_set_by_content_should_leave_others_untouched
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_delete_record_with_record_set_name_remove_all
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_list_records_should_handle_record_sets
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_list_records_with_fqdn_name_filter_should_return_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_list_records_with_full_name_filter_should_return_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_list_records_with_invalid_filter_should_be_empty_list
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_list_records_with_name_filter_should_return_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_list_records_with_no_arguments_should_list_all
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_update_record_should_modify_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_update_record_should_modify_record_name_specified
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_update_record_with_fqdn_name_should_modify_record
FAILED lexicon/tests/providers/test_namecheap.py::NamecheapManagedProviderTests::test_provider_when_calling_update_record_with_full_name_should_modify_record
= 71 failed, 2070 passed, 246 skipped, 58 deselected, 1 warning in 682.26s (0:11:22) =
==> ERROR: A failure occurred in check().
    Aborting...
==> ERROR: Build failed, check /var/lib/archbuild/extra-riscv64/root8/build
receiving incremental file list
dns-lexicon-3.12.0-1-riscv64-build.log
dns-lexicon-3.12.0-1-riscv64-check.log

sent 62 bytes  received 20,782 bytes  41,688.00 bytes/sec
total size is 1,044,180  speedup is 50.09
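A build-side workaround consistent with what the log shows: the chroot clearly has network access (the sources downloaded fine), and the Public Suffix List fetch only fails because it happens inside a write-protected cassette. Warming tldextract's disk cache once before any cassette is active would make every later lookup a cache hit. A hypothetical session-scoped pytest fixture, assuming the build user's cache directory is writable:

    # conftest.py -- hypothetical sketch, not part of the lexicon source tree.
    import pytest
    import tldextract

    @pytest.fixture(scope="session", autouse=True)
    def warm_public_suffix_cache():
        """Run one extraction before any VCR cassette engages.  The first call
        fetches the Public Suffix List over HTTP and writes the disk cache, so
        DiskCache.get() hits during the tests instead of raising KeyError and
        falling through to a blocked live GET."""
        tldextract.extract("example.com")
        yield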